In this episode of The Nerd Herd, we continue our conversation about Sovereign Cloud. We now have a few more details on how the local variant of Azure will work.
Sometimes when you customize a mapping (in this example DV Released Products), you will get an error similar to "Project validation failed. [DIPV1004]". This error comes from Data Integrator in Dataverse, which is the engine that drives DualWrite.
"Integration key field isn't bi-directionally mapped"
The entity in my case was DV Released products, where I had added a field for Search Name.
Let's first look at the Integration Keys for DV Released Products / msdyn_sharedproductdetails.
The fields that are Integration Keys in DualWrite are ItemNumber and Company. When we look at those fields in the DualWrite mapping, ItemNumber is mapped one-way, which means that no other fields in the mapping profile can be mapped two-way. (The Company code is almost always in the mapping, due to the concept of Legal Entities.)
In my case, the SearchName field was accidentally mapped two-way. When I fixed it, the error went away and I was able to do an Initial Sync.
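The validation rule can be pictured as a small check, sketched below in plain Python. This is purely illustrative of the rule described above, not Data Integrator's internal model; the field and mapping structure are my own assumptions.

```python
# Illustrative sketch of the DIPV1004 rule: if any integration key is
# mapped one-way, no other field in the mapping profile may be mapped
# two-way. The data structure here is a hypothetical simplification.

def find_invalid_fields(mapping):
    """Return non-key fields mapped two-way while an integration key is one-way."""
    keys_one_way = any(
        f["direction"] == "one-way" for f in mapping if f.get("integration_key")
    )
    if not keys_one_way:
        return []
    return [
        f["name"]
        for f in mapping
        if f["direction"] == "two-way" and not f.get("integration_key")
    ]

mapping = [
    {"name": "ItemNumber", "direction": "one-way", "integration_key": True},
    {"name": "Company", "direction": "one-way", "integration_key": True},
    {"name": "SearchName", "direction": "two-way"},  # the accidental two-way mapping
]
print(find_invalid_fields(mapping))  # -> ['SearchName']
```

Changing the SearchName row to one-way makes the returned list empty, which mirrors the fix that made the error go away.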
When you map products between Dynamics 365 for Finance and Supply Chain and Dynamics 365 Sales, there are a couple of basic concepts that are important to understand.
Products
A Product in FnO contains master data. It is also a global table, available in all Legal Entities. To sell a product, we first need to release it in a Legal Entity. DualWrite syncs these to the table msdyn_globalproducts.
Released Products
A Released Product, also known as an Item, is a product that has been released to a Legal Entity. In D365FO, Microsoft has created a custom entity used for syncing Released Products to CRM, called DV Released products. In the DualWrite mapping, it syncs to the table called msdyn_sharedproductdetails.
Released Distinct Products
As you may notice, none of the Dataverse tables above are native CRM tables for products (the hint here is the msdyn_ prefix). Due to the difference in data structure between ERP and CRM, Microsoft created an additional Data Entity in FnO that we can sync more easily to CRM. DV Released distinct products is basically a copy of Released Products, with some added metadata such as Configuration, Color and Size.
Sometimes when you set up DualWrite, rows in the databases do not sync correctly in the Initial Sync, and then you need to help them along a bit. To "touch" (edit to force a sync) Products and Released Products in FnO, you simply go to their pages in Product information management. There is no workspace for Released Distinct Products.
In order to trigger a sync of a Released Distinct Product to Products, you need to touch one of these fields:
The easiest is to go to the Product (not the Released Product) in FnO and edit the Description field. That will trigger a sync.
When setting up DualWrite for orders and quotes and doing the initial sync, some order/quote lines sometimes do not sync over correctly, especially if you have historical orders that are already delivered.
If an order is delivered in FnO and the quote/order header is synced to CE, the lines will not sync. The result is that whenever you try to do something with that order (for instance invoice it), logic in CE sees that it is Delivered and sets it to read-only. Depending on whether this happens during Initial Sync or during Live Sync, there are different ways to fix it.
Initial Sync
While you are running the initial sync of quotes/orders, you can use a workaround to correctly sync lines to an invoiced/delivered order/quote.
The trick is to manipulate the sync mapping for the order/quote header and temporarily set all headers to Open. NOTE: this is only temporary, in order to get the lines to sync correctly.
1. Open the DualWrite workspace and go to the mapping for Dynamics 365 Sales order headers (salesorders).
2. If it is started, stop the mapping.
3. Find the SALESORDERPROCESSINGSTATUS line and change delivered and invoiced to 192350000 (this will set them to active in CRM).
4. Click Save as to save the mapping. Give it a descriptive name so you don't use it by mistake.
5. Perform the initial sync for Dynamics 365 Sales order headers (salesorders).
6. Sync Dynamics 365 Sales order lines (salesorderdetails).
7. Set the mapping for Dynamics 365 Sales order headers (salesorders) back to the correct mapping and perform a new initial sync.
8. Start the mappings for Dynamics 365 Sales order headers (salesorders) and Dynamics 365 Sales order lines (salesorderdetails) without running the sync again.
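The temporary remap in step 3 can be pictured as a simple value substitution. The sketch below is illustrative only: 192350000 is the Active value mentioned in the steps, while the FnO-side status names are my own assumptions.

```python
# Sketch of the temporary transform in step 3: Delivered and Invoiced
# headers are remapped to the CRM "Active" option-set value so their
# lines are allowed to sync. Only 192350000 comes from the post; the
# source status names are illustrative placeholders.
ACTIVE = 192350000

temporary_status_map = {
    "Open order": ACTIVE,
    "Delivered": ACTIVE,   # normally read-only in CE
    "Invoiced": ACTIVE,    # normally read-only in CE
}

def map_status(fno_status: str) -> int:
    """Map an FnO sales order processing status to a CRM status value."""
    return temporary_status_map[fno_status]

print(map_status("Delivered"))  # -> 192350000
```

Once the lines have synced, step 7 restores the real mapping so Delivered and Invoiced become read-only in CE again.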
Live Sync
When you are already live with the solution and you find an order that is missing lines, the issue is that you cannot run an initial sync, because that would destroy the data.
To figure out if this is your issue, open the order in FnO and CE and compare them. If the order is missing lines or the entire order is missing from CE, go through these steps:
1. Open the order in FnO
2. Select the first line in the order and click Update line – Deliver remainder
3. Set the Sales Quantity and Inventory Quantity to 1 and click OK
4. The status of the order is now set to Open Order, and the order should exist in CE with the line you made the change for.
5. If there are more lines on the order in FnO which have not synced to CE, make a small edit to the line (add a . to the Text field). This will force it to sync.
6. When the order is correct in CE, go back to the first line in FnO and click Update Line – Deliver remainder. Click Cancel quantity.
We are working with a couple of customers that are running DualWrite to sync between ERP and CRM.
Today one of my colleagues called me and told me he could not create an account in an environment at one of these customers… he got a Dualwrite error 😮
Dual Write core application error-SecureConfig Organization (XXXX) does not match actual Dataverse Organization (YYYY)
(the names have been changed to protect the innocent)
The weird thing is that this particular environment is not even connected with DualWrite. It does not even have a D365FO app installed. It has, however, been refreshed from an environment that has DualWrite active.
The best practice when refreshing environments with DualWrite is to reset the DualWrite configuration and activate all of the mappings again. The reset is done from the FnO DualWrite configuration screen:
Since this environment does not have an FnO app, there is no way to reset the configuration. We need to do this the hard way 🙂
To fix this, do the following:
1. Browse to https://make.powerapps.com and select the correct environment
2. Go to Tables, view all tables and find the table called Dual Write Runtime Configuration. This table contains the DualWrite configuration. Usually, when you reset the DualWrite connection after a refresh, this table is emptied.
3. Open the table, select all the rows and delete them.
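If you prefer scripting over clicking in the Maker portal, the same cleanup could in principle be done through the Dataverse Web API. The sketch below only builds the DELETE URLs: the environment URL is a placeholder, the entity set name is an assumption you must verify against your own environment, and a real call would also need an OAuth bearer token.

```python
# Sketch: build Dataverse Web API DELETE calls that would empty the
# Dual Write Runtime Configuration table. Both the environment URL and
# the entity set name below are placeholders -- verify them in your
# environment before deleting anything.

ENV_URL = "https://yourorg.crm.dynamics.com"   # placeholder environment
ENTITY_SET = "msdyn_dualwriteruntimeconfigs"   # assumed entity set name

def build_delete_urls(row_ids):
    """One DELETE URL per row id returned by a prior GET on the table."""
    return [f"{ENV_URL}/api/data/v9.2/{ENTITY_SET}({row_id})" for row_id in row_ids]

urls = build_delete_urls(["11111111-1111-1111-1111-111111111111"])
print(urls[0])
```

Each URL would then be sent as an authenticated HTTP DELETE, one per configuration row, which mirrors selecting all rows and deleting them in the portal.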
In this episode of The Nerd Herd, the guys talk about identities in the cloud. We sort out the difference between Application Registrations, Enterprise Applications and Service Principals… mostly so that Johan will understand.
When LCS is eventually deprecated, we will lose our main source of out-of-the-box monitoring and telemetry for Dynamics 365 for Finance and Supply Chain. Fortunately, Microsoft has a plan, and D365FO is now able to connect and send telemetry to Azure Application Insights.
Application Insights is Microsoft's service for telemetry and insights. It receives data, stores it in a database, and you run queries against it and build dashboards to see the health of your environment. Most Azure services can also send telemetry to Application Insights, which means you get a "single pane of glass" where you can see D365FO and integration services on the same timeline.
When I talk to my customers, most of them think it is a hard thing to set up… in this post I will show the steps and do it in 15 minutes:
Create a new App Insights Instance
Wait for it to finish
Set the retention of data. If you are doing the setup for test, dev or lab, it is a great idea to limit the amount of data that is saved (since that is what drives cost)
Go to the Analytics Workspace
Click Usage and Estimated Cost
Set a Daily Cap to limit the amount of data collected per day
Set the time data will be retained
In FnO, activate the Feature "Monitoring and Telemetry"
In FnO, go to Monitoring and Telemetry in System Administration
Activate all the checkboxes
In Environments, create your environment
Enter the Environment ID (found in LCS or in PPAC)
Set Mode
Save
In the Application Insights Registry
Enter the Instrumentation Key from Azure
Enter the Connection String
Verify functionality
Click around in FnO
In the Log Analytics Workspace, click Logs and dismiss the pop-up
Click Page Views and choose run
Now you have data 🙂
This was a really quick setup (it actually took 15 minutes, including the initial draft of the blog post) of App Insights and as you might imagine now we are at the point where the real work starts. Making sense of your data is the real job.
Microsoft FastTrack has released a bunch of ready-made reports and dashboards that you can use to get started. These are made for Azure Data Analytics, which I will set up in a later blog post. There are also a whole bunch of new features coming that we will look into in later posts.
Good luck getting started with Application Insights!
Ever since Microsoft released it, I have made heavy use of the Excel Add-in for Dynamics 365 for Finance and Supply Chain. It is a really amazing tool for performing bulk edits.
One of the issues I see a lot when setting up DualWrite for customers is that customer accounts have their own account number set as invoice account, pointing to themselves.
This is not allowed, since Dataverse does not permit circular references in a table. As far as I understand (after asking my finance colleagues), emptying the invoice account field is not a problem, since the account defaults to itself as invoice account. There can, however, be accounts that have a different invoice account set, and those of course should not be cleared.
I usually use the Excel Add-in to identify and remove the invoice account where it should be removed.
Open Customers in Excel
Create a Conditional Formatting Rule
1. From the ribbon, go to Home – Conditional Formatting – Manage Rules
2. Click New Rule… and choose "Use a formula to determine which cells to format"
3. Enter the formula =$D2=$G2 and use Format to change the formatting to a red fill
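The same check can be scripted if you prefer. Here is a minimal sketch in plain Python that mirrors the Excel formula; the column names AccountNum and InvoiceAccount are my assumptions about the exported data, so adjust them to match your sheet.

```python
# Sketch: flag customers whose invoice account points back to themselves,
# mirroring the Excel formula =$D2=$G2. Column names are assumptions.

customers = [
    {"AccountNum": "US-001", "InvoiceAccount": "US-001"},  # self-reference: clear it
    {"AccountNum": "US-002", "InvoiceAccount": "US-001"},  # real invoice account: keep
    {"AccountNum": "US-003", "InvoiceAccount": ""},        # already empty: keep
]

def self_referencing(rows):
    """Accounts whose invoice account equals their own account number."""
    return [r["AccountNum"] for r in rows if r["InvoiceAccount"] == r["AccountNum"]]

print(self_referencing(customers))  # -> ['US-001']
```

Only the self-referencing accounts should have the invoice account cleared; accounts that point at a different invoice account are left alone, just as described above.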
A while ago, one of my colleagues contacted me with a login issue.
The colleague was trying to log into a customer's environment (where she is a guest user) with her company account (from our company). When trying to log in, this error appeared:
In Microsoft Entra ID, there is a feature called Risky Sign-ins. It flags users that try to log in in a "weird" way, for instance logging in from a Dutch IP one minute after logging in from a Swedish IP (there is a link at the bottom of the post with more detailed information).
This information is logged in Entra ID and (sometimes) acted upon. The reason I say sometimes is that the action part requires an Entra ID P2 license.
When the colleague contacted me, I thought this should not be happening, because we do not have Entra ID P2 licenses for our users.
After some digging and looking at the colleague's account in the Azure Portal, I saw that it had a risky sign-in flagged, and when I cleared it, the login started working.
Apparently, the risky sign-in information follows guest accounts over to our customers' tenants, and this customer had enforcement for risky users enabled… well, I learned something today as well 🙂
Microsoft has announced that all Basic SKU Public IP addresses in Azure will be retired on 30 September 2025. If you’re currently using Basic SKU IPs, it’s time to start planning your upgrade to the Standard SKU to avoid service disruptions. This change affects virtual machines, load balancers, and other resources relying on Basic IPs—so early action is key.
Basic vs Standard Public IPs (Source: Microsoft Learn)
| Aspect | Standard SKU Public IP | Basic SKU Public IP |
|---|---|---|
| Allocation Method | Static | IPv4: Dynamic or Static; IPv6: Dynamic |
| Security Model | Secure by default (closed to inbound traffic unless explicitly allowed via NSG) | Open by default (NSG optional) |
| Availability Zones | Supported (non-zonal, zonal, or zone-redundant) | Not supported |
| Routing Preference | Supported | Not supported |
| Global Tier Support | Supported (via cross-region load balancers) | Not supported |
| Standard Load Balancer | Supported | Not supported |
| NAT Gateway Support | Supported | Not supported |
| Azure Firewall Support | Supported | Not supported |
Basic vs Standard Load Balancer (Source: Azure Docs)
| Feature | Standard Load Balancer | Basic Load Balancer |
|---|---|---|
| Scenario | High performance, ultra-low latency, zone-aware, cross-region | Small-scale apps, no zone support |
| Backend Pool Type | IP-based, NIC-based | NIC-based |
| Protocol Support | TCP, UDP | TCP, UDP |
| Health Probes | TCP, HTTP, HTTPS | TCP, HTTP |
| Availability Zones | Zone-redundant, zonal, non-zonal | Not available |
| Diagnostics | Azure Monitor multi-dimensional metrics | Not supported |
| HA Ports | Available | Not available |
| Secure by Default | Closed to inbound flows unless allowed via NSG | Open by default |
| Outbound Rules | Declarative outbound NAT configuration | Not available |
| TCP Reset on Idle | Available | Not available |
| Multiple Frontends | Inbound and outbound | Inbound only |
| Management Operations | Most < 30 seconds | 60–90+ seconds |
| SLA | 99.99% | Not available |
| Global VNet Peering Support | Supported | Not supported |
| NAT Gateway & Private Link | Supported | Not supported |
In order for your VMs to continue to function after September 30th 2025, you need to update this.
How Do I Know?
To see which VMs have the old IP SKU, go to the Azure Portal and search for Public IP in the top search bar. You will get a list of all Public IP addresses.
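If you have many IPs or subscriptions, the portal list gets tedious. As a sketch, you could filter the JSON that `az network public-ip list` returns; the helper below runs on illustrative sample data standing in for that output.

```python
import json

# Sketch: find Basic-SKU public IPs in the JSON output of
# `az network public-ip list`. The sample data below is illustrative,
# not real CLI output from any environment.

sample_output = json.loads("""
[
  {"name": "devbox-ip", "sku": {"name": "Basic"}},
  {"name": "prod-lb-ip", "sku": {"name": "Standard"}}
]
""")

def basic_sku_ips(public_ips):
    """Names of public IPs that still use the retiring Basic SKU."""
    return [ip["name"] for ip in public_ips if ip["sku"]["name"] == "Basic"]

print(basic_sku_ips(sample_output))  # -> ['devbox-ip']
```

Every name this returns is an IP you need to deal with before the September 2025 retirement date.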
What do I need to do?
If the VM is old and only used for development, it is fairly easily replaced. Either you just redeploy it from LCS (the new VM will have the correct IP SKU) or, if you are up for it, you can deploy one of Microsoft's fancy new Universal Development Environments and then you will not have to worry about the infrastructure again.
If you need to keep the VM, there is an option to upgrade the IP address. However, all of the VMs that I have needed to update were created from LCS, and all of those have a load balancer that also needs to be upgraded.
Upgrading the Load Balancer
All of the VMs deployed from LCS have a load balancer, so if you have an older VM with a legacy IP, chances are that you also have a Basic load balancer. Since we cannot have a Standard IP connected to a Basic load balancer, we need to upgrade them both. Fortunately, there is a nice script to do this.
Start a Cloud Shell (PowerShell) in the Azure portal and import the module
When you run the upgrade script for the load balancer, it will also upgrade the Public IP to the Standard SKU.
Missing Backend Pool
On some of the VMs I have updated, the load balancer configuration was missing a backend pool, and the VM was of course not in that non-existent pool.
To add the pool and the VM, go to the load balancer, open Settings – Backend pools and click Add. Give the pool a name (for example vm-backend-pool) and add the VM to it by clicking Add.
Updating the IP
To upgrade the IP, you will need to temporarily disassociate it from the VM and then reconnect it after the upgrade. These are the steps:
Disassociate the IP from the resources it is connected to