• What is a Released Distinct Product

    When you map products between Dynamics 365 for Finance and Supply Chain and Dynamics 365 for Sales, there are a couple of basic concepts that are important to understand.

    Products

    The Product in FnO contains master data. It is also a global table, available in all Legal Entities. To sell a product, we first need to release it in a Legal Entity. Dualwrite syncs products to the table msdyn_globalproducts.

    Released Products

    A Released Product, also known as an Item, is a product that has been released to a Legal Entity. In D365FO, Microsoft has created a custom entity used for syncing Released Products to CRM, called DV Released products. In the Dualwrite mapping, it syncs to the table msdyn_sharedproductdetails.

    Released Distinct Products

    As you may notice, none of the Dataverse tables above are native CRM tables for products (the hint is the msdyn_ prefix). Due to the difference in data structure between ERP and CRM, Microsoft created an additional Data Entity in FnO that we can sync more easily to CRM. DV Released distinct products is basically a copy of Released Products, with some added metadata such as Configuration, Color and Size.

    Sometimes when you set up Dualwrite, rows in the databases do not sync correctly in the Initial Sync, and you need to help them along a bit. To “touch” (edit them to force a sync) Products and Released Products in FnO, you simply go to their pages in Product information management. There is, however, no workspace for Released Distinct Products.

    To trigger a sync of a Released Distinct Product, you need to touch one of the mapped fields:

    The easiest way is to go to the Product (not the Released Product) in FnO and edit the Description field. That will trigger a sync.

  • Dualwrite: Missing order lines

    When setting up Dualwrite for orders and quotes and doing the initial sync, sometimes not all order/quote lines sync over correctly, especially if you have historical orders that have already been delivered.

    If an order is delivered in FnO and the header of the quote/order is synced to CE, the lines will not sync. The result is that whenever you try to do something with that order (for instance invoice it), logic in CE sees that it is Delivered and sets it to read-only. Depending on whether this happens during initial sync or during live sync, there are different methods to fix it.

    Initial Sync

    While you are running the initial sync for quotes/orders, you can use a workaround to correctly sync lines to an invoiced/delivered order or quote.

    The trick is to temporarily manipulate the sync mapping for the order/quote header so that all headers are set to Open. NOTE: this is only temporary, in order to get the lines to sync correctly.

    1. Open the Dualwrite workspace and go to the mapping for Dynamics 365 Sales order headers (salesorders).

    2. If it is started, stop the mapping.

    3. Find the SALESORDERPROCESSINGSTATUS line and change Delivered and Invoiced to 192350000 (this will set them to Active in CRM).

    4. Click Save as to save the mapping. Give it a good description so you don't use it by mistake later.

    5. Perform the initial sync for Dynamics 365 Sales order headers (salesorders).

    6. Sync Dynamics 365 Sales order lines (salesorderdetails).

    7. Set the mapping for Dynamics 365 Sales order headers (salesorders) back to the correct mapping and perform a new initial sync.

    8. Start the mappings for Dynamics 365 Sales order headers (salesorders) and Dynamics 365 Sales order lines (salesorderdetails) without running the initial sync again.
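    For reference, the SALESORDERPROCESSINGSTATUS transform edited in step 3 is a ValueMap in the mapping JSON. A hypothetical sketch of the temporary version is shown below; the source-side keys are illustrative and may not match the exact labels in your mapping, but 192350000 is the Active status value mentioned above:

    ```json
    [
        {
            "transformType": "ValueMap",
            "valueMap": {
                "Delivered": "192350000",
                "Invoiced": "192350000"
            }
        }
    ]
    ```

    Remember to switch back to the original value map before starting the live mappings, so statuses sync correctly going forward.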

    Live Sync

    When you are already live with the solution and you find an order that is missing lines, the problem is that you cannot run an initial sync, because that would destroy the data.

    To figure out whether this is your issue, open the order in both FnO and CE and compare them. If the order is missing lines, or the entire order is missing from CE, go through these steps:

    1. Open the order in FnO
    2. Select the first line in the order and click Update line – Deliver remainder


    3. Set the Sales Quantity and Inventory Quantity to 1 and click OK


    4. The status of the order is now set to Open order, and the order should exist in CE with the line you made the change for.


    5. If there are more lines on the order in FnO which have not synced to CE, make a small edit to the line (add a . to the Text field). This will force it to sync.
    6. When the order is correct in CE, go back to the first line in FnO and click Update Line – Deliver remainder. Click Cancel quantity.

    The order is now back to the status Delivered.

  • Dual Write core application error-SecureConfig Organization (XXXX) does not match actual Dataverse Organization (YYYY)

    We are working with a couple of customers that run Dualwrite to sync between ERP and CRM.

    Today one of my colleagues called me and told me he could not create an account in an environment at one of these customers… he got a Dualwrite error 😮

    Dual Write core application error-SecureConfig Organization (XXXX) does not match actual Dataverse Organization (YYYY)

    (the names have been changed to protect the innocent)

    The weird thing is that this particular environment is not even connected with Dualwrite. It does not even have a D365FO app installed. It has, however, been refreshed from an environment that has Dualwrite active.

    The best practice when refreshing environments with Dualwrite is to reset the Dualwrite configuration and activate all of the mappings again. The reset is done from within the FnO Dualwrite configuration screen:

    Since this environment does not have an FnO app, there is no way to reset the configuration. We need to do it the hard way 🙂

    To fix this, do the following:

    1. Browse to https://make.powerapps.com and select the correct environment

    2. Go to Tables, view all tables, and find the table called Dual Write Runtime Configuration. This table contains the Dualwrite configuration. Usually, when you reset the Dualwrite connection after a refresh, this table is emptied.

    3. Open the table, select all the rows and delete them.

    Links:

    F&O 💙 Power Platform – Database Refresh – Carina M. Claesson

    Dual Write – Organization does not match actual Dataverse Organization

  • App Insights for FnO in 15 Minutes

    When LCS is eventually deprecated, we will lose our main source of out-of-the-box monitoring and telemetry for Dynamics 365 for Finance and Supply Chain. Fortunately, Microsoft has a plan, and D365FO is now able to connect and send telemetry to Azure Application Insights.

    Application Insights is Microsoft's service for telemetry and insights. It receives data, stores it in a database, and lets you run queries against it and build dashboards to monitor the health of your environment. Most Azure services can also send telemetry to Application Insights, which means you get a “single pane of glass” where you can see D365FO and your integration services on the same timeline.

    When I talk to my customers, most of them think it is hard to set up… in this post I will show the steps and do it in 15 minutes:

    1. Create a new App Insights Instance

      Pasted image 20250514090510.png
      Wait for it to finish

    2. Set the retention of data. If you are doing the setup for test, dev or lab environments, it is a great idea to limit the amount of data that is saved (since data volume is what drives cost).

      1. Go to the Analytics Workspace
        BlogPost - Appinsights in 15 Minutes.png
      2. Click Usage and Estimated Cost
        BlogPost - Appinsights in 15 Minutes-1.png
      3. Set a Daily Cap to limit the amount of data collected per day
        BlogPost - Appinsights in 15 Minutes-2.png
      4. Set the time data will be retained
        BlogPost - Appinsights in 15 Minutes-3.png
    3. In FnO, activate the Feature "Monitoring and Telemetry"
      Pasted image 20250514090844.png
    4. In FnO, go to Monitoring and telemetry under System administration
    5. Activate all the checkboxes
      Pasted image 20250514091013.png
    6. In Environments, create your environment

      1. Enter the environment ID (from LCS or PPAC)
      2. Set Mode
      3. Save
    7. In the Application Insights Registry

      1. Enter the Instrumentation Key from Azure
      2. Enter the Connection String
    8. Verify functionality

      1. Click around in FnO
      2. In the Log Analytics Workspace, click Logs and dismiss the pop-up
      3. Click Page Views and click Run
        BlogPost - Appinsights in 15 Minutes-4.png
      4. Now you have data 🙂
        BlogPost - Appinsights in 15 Minutes-5.png
        This was a really quick setup of App Insights (it actually took 15 minutes, including the initial draft of this blog post), and as you might imagine, we are now at the point where the real work starts. Making sense of your data is the real job.
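    The Page Views check in step 8 can also be written as a small KQL query in the Logs blade. A minimal sketch, using the standard Application Insights schema (the table is pageViews when querying from the Application Insights resource; when querying the workspace directly, the same data lives in AppPageViews with TimeGenerated/Name columns):

    ```kusto
    // Page views in the last hour, most-viewed forms first
    pageViews
    | where timestamp > ago(1h)
    | summarize views = count() by name
    | order by views desc
    ```

    Queries like this are the building blocks for the dashboards and alerts mentioned above.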

    Microsoft FastTrack has released a bunch of ready-made reports and dashboards that you can use to get started. These are made for Azure Data Analytics, which I will set up in a later blog post. There are also a whole bunch of new features coming that we will look into in later posts.

    Good luck getting started with Application Insights!

  • Using Open Source Software in your Dynamics Implementation

    The idea for this article started for a couple of reasons. The first thing that happened was that Alex Meyer released his D365FO Admin Toolkit on GitHub. The second was that I read the brilliant article Scary dangerous creepy tools by Jonas Rapp, and these two things made me think about the benefits and challenges of using Open Source in Dynamics 365 for Finance and Supply Chain. (Since then there have been others, such as Jonas Feddersen Melgaard's D365FinanceToolbox.)

    First of all, some background… About 10 years ago, I left the “IT infrastructure” world and ventured into the Dynamics world. On the infrastructure side, Open Source Software is a big thing. The majority of web servers on the internet run on Linux, a lot of internet appliances (such as firewalls and routers) run on Linux, more than half of the world's cell phones have a base in Open Source, and components such as curl, OpenSSH and Mermaid are used by millions of business users every day, since they are baked into commercial products. In fact, 60% of all compute cores in Microsoft Azure run some version of Linux.

    So the question is: why is it OK to use Open Source Software, built by the community, everywhere else but not in Dynamics? “Because it is our super-duper critical ERP system, of course!!!!” Well, I would argue that there are more important systems in your organization (not to belittle your ERP system) that use at least a couple of Open Source components. That means that the actual issue is not the code itself… it is something else. In this article we will try to understand the blockers and why they might not be relevant.

    I agree that pointing out that a lot of commercial closed source software uses Open Source components is a bit dishonest. However, it makes a good point, and I think it puts a finger on the real issue here: responsibility. When a company embeds a 3rd party component in its software, it assumes responsibility for it. It agrees to patch it (which is not always done) and to take responsibility for the end product towards the customer. We will come back to the question of responsibility later in this text.

    The Benefits

    There are a couple of benefits to using software made by someone else in our solution. The most obvious one is the same as the argument for ISV software: we do not need to build something from scratch, which frees up time for our project and our developers to do other things.

    There is also the quality argument. If someone has built it and many others use it, the risk of it being broken is smaller than if everyone builds their own solution to a common problem. Working iteratively on a common software base over time will hopefully make it less likely to break. Another argument here will be “But we are solving our own, unique issues”, and that might be so… but not all of your challenges are unique. If you have an issue in Dynamics, it may well be that others have the same issue and need to solve it too.

    There is also the question of missing features in the product. When Microsoft adds new features to Dynamics, they need to prioritize them, and if not enough organizations want a feature, it will not bubble to the top of the priority list and thus will not be built. Community-developed features help bridge that gap and also act as an indicator to Microsoft of what customers want in the product. On the consumer side of tech, there is a concept called Sherlocking: when (in this case) Apple implements a function, which a 3rd party software developer has built, directly into iOS or macOS.

    The Challenges

    Earlier I compared Open Source Software to ISV solutions, and to be honest there is one big difference… responsibility. When you buy an ISV solution from a vendor, or you let a partner build and implement your solution for you, there is always someone else who assumes responsibility for the code written. But as with all contracts, there are always disclaimers… this applies even to the license agreement of Dynamics. There are some things not covered by the agreement. The responsibility always lands on the end customer in the end, and you as an end customer need to be ready to assume it. If we assume that the responsibility ultimately falls on the customer, there is basically no difference between Microsoft-, ISV-, partner-, Open Source- or customer-created code. We still need to test it and make sure that it is maintained and updated. The main difference is that we (as an organization) cannot affect Microsoft's, the ISV's or (in some cases) the partner's code. We are, however, able to make changes to an Open Source solution.

    I know that there are ISVs out there that supply the source code for their solution… Is that the same as using Open Source? Well, not exactly, but sort of… There are upsides and downsides to this. If we buy the solution from the vendor and we have a support agreement, we should try to stay away from editing the code; it blurs the lines of responsibility. With that being said, there is still a benefit in that we can speed up the troubleshooting process, because we are able to debug and help provide the solution to the vendor. The real benefit of getting access to the codebase of an ISV solution, however, is if something happens to the vendor and they go out of business. In that case we can choose to continue to support the solution ourselves.

    The Commitments

    As we have seen in this article, there are some things we need to think about when we start using Open Source Software. As always, we need to make sure the software holds up to the level of quality we need, and we need to keep it updated (and with that comes, of course, testing and code review, in the same way as with our own code). But I also think there is another level of commitment here: if we find a bug in the code, we should be a “good citizen” of the community and at least report the bug (maybe even with a proposed solution), or even write a fix and submit a pull request back to the project to get the fix into the code base… if it benefits us, it will probably benefit others.

    “So that means that we should use our precious time writing code for others?” Yes, with the time we save having others write code and test it for us, we absolutely should pitch in and help. I am absolutely convinced that we will spend less time in the long run while at the same time helping others do the same.

    The Conclusions

    Using 3rd party tools always needs to be a deliberate choice, and going with an Open Source solution comes with its own challenges. But we also need to understand that building all of our customizations from scratch means we will be using “one-off” solutions that do not always adhere to best practices. In that case we are on our own, but when we use a solution that is built by the community, at least we can figure out the solution together.

  • Troubleshooting Dualwrite Microsoft.Dynamics.Integrator.Exceptions.IntegratorClientException

    Currently I am in the middle of installing Project Operations for a customer. In order to provide data to Project Operations, we need to use Dualwrite to move data from Dynamics 365 for Finance and Supply Chain into Dataverse, which Project Operations uses as its database.

    Yesterday I found a weird Dualwrite issue. In order to sync Customers, we also need to sync the entity CDS Contacts V2 (contacts). I started the initial sync… and after running for around 6 (!!) hours, it failed with the following error:

    Type=Microsoft.Dynamics.Integrator.Exceptions.IntegratorClientException, Msg=Type=Microsoft.Dynamics.Integrator.Exceptions.IntegratorClientException, Msg=FinOps export encountered an error.(Type=Microsoft.Dynamics.Integrator.Exceptions.IntegratorClientException, Msg=Export failed, Please check execution for project DWM-d91cec93-bcf1-4a8e-a7fa6b0615e195c45fb9bb52d8690665b9c_0 and execution id ExportPackage-9/2/2025 7:23:39 AM-a41d84f0-e619-4a43-83db-d4f6a4855b97 in FnO. Error details Type=Microsoft.Dynamics.Integrator.Exceptions.IntegratorClientException, Msg=F&O export encountered an error. Please check project and execution ExportPackage-9/2/2025 7:23:39 AM-a41d84f0-e619-4a43-83db-d4f6a4855b97 in F&O)
    

    I updated mappings and refreshed the entity list (as you do) and reran it, with the same issue.

    Initial sync uses the Data Management Framework (DMF) to move the data to Dataverse, so I thought I should look at the execution logs for the DMF project. But when I filtered the list, the project did not exist (!?!?!?!)

    As the next step, I tried to manually export CDS Contacts V2 (contacts) from DMF. I finally got a useful error!

    That led me to the Entity List in DMF… where I found something strange:

    Normally the status of the entity should be Enabled… it was not. I then went to the License configuration page in FnO.

    It turns out the customer had disabled a lot (!) of configuration keys, and one of them is CDS Integrations… After entering Maintenance Mode and enabling the key, the entities were still disabled. To see the correct status, you need to do an Entity list refresh from DMF – Framework parameters.

    After that, the sync went through just fine.

    Today's learning is about configuration keys… Do not disable them if you are not able to oversee the full consequences of doing so.

  • Dualwrite – Beware of the reverse

    I am setting up Dualwrite at a customer, and I ran into an issue the other day. The customer wants to be able to create customers in Dynamics 365 CE and sync them to Dynamics 365 FO. We had done all of the initial syncs and run multiple tests creating Accounts in CE, and they synced perfectly to FO.

    While trying to figure out a way to manage Financial Dimension population when creating accounts, I tried creating a customer directly in FO, just to test a thing… and it failed!!

    I tried again from CE and it worked… but not from FO. We had been so focused on testing one direction but not the other… Doh!

    So, what was the issue? I got this error:

    Unable to write data to entity accounts.Writes to CustCustomerV3Entity failed with error message Request failed with status code BadRequest and CDS error code : 0x80048d19 response message: Error identified in Payload provided by the user for Entity :'accounts', For more information on this error please follow this help link https://go.microsoft.com/fwlink/?linkid=2195293 ----> InnerException : Microsoft.OData.ODataException: Cannot convert the literal '' to the expected type 'Edm.Int32'. ---> System.FormatException: Input string was not in a correct format.
    

    and then it continued with a stack trace… Hmmm…

    So I started brainstorming with a colleague: it is obviously a data type mismatch, and when I turn off the mapping for Account, it works. Going through the mapping, we saw that we had added a few transform mappings, so we started there. It turned out that all of these were 1-to-1 value mappings. The problem was that three of the fields were not on the initial “customer creation sidebar”. In CE these fields were made mandatory, but in FO they were not.

    I took a look at the mappings for these three fields, and one stood out: it had no mapping for the empty value.

    This meant that when going from FO to CE, Dualwrite tried to convert an empty string to an integer, and since there was no transform for the empty value and no default, the record could not be written to CE.

    The easiest way to map the empty value to null is to edit the JSON version of the mapping (I did not know this was possible until a short while ago):

    [
    	{
    		"transformType": "ValueMap",
    		"valueMap": {
    			"Dealer": "787980000",
    			"End user": "787980001",
    			"Nat OEM": "787980002",
    			"Reg OEM": "787980003",
    			"Int OEM": "787980004",
    			"Integrator": "787980005",
    			"Contractor": "787980006",
    			"Other": "787980007",
    			"": null
    		}
    	}
    ]
    
    
    
    
    
    

    Add the last line, and do not forget the comma at the end of the previous line.
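    To see why that last line matters, here is a tiny Python sketch. The value map is the one from the JSON above; the lookup function is a simplified stand-in for what the ValueMap transform does, not Dualwrite's actual implementation:

    ```python
    # Simplified stand-in for a Dualwrite ValueMap transform (illustration only).
    value_map = {
        "Dealer": "787980000",
        "End user": "787980001",
        "Other": "787980007",
        "": None,  # the added empty-value mapping
    }

    def transform(source_value):
        # Without the "" entry, an empty FO field has no target value,
        # and CE ends up trying to parse '' as an Edm.Int32, which fails.
        if source_value not in value_map:
            raise ValueError(f"No mapping for {source_value!r}")
        return value_map[source_value]

    print(transform("Dealer"))  # → 787980000
    print(transform(""))        # → None, instead of a BadRequest from CE
    ```

    With the empty key mapped to null, an empty source field simply writes null to CE rather than an unparseable string.
    
    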

    That was it… I saved it and restarted the mapping. We verified it… It worked!!!

    That was it for today… see you around

  • AADSTS50011: The redirect URI ‘https://D365FOenv.operations.eu.dynamics.com/’ specified in the request does not match the redirect URIs

    Last week I needed to set up a new Dynamics 365 for Finance and Supply Chain environment, and I got a strange error message which took some time to figure out.

    AADSTS50011: The redirect URI 'https://enadvdemo01.operations.eu.dynamics.com/' specified in the request does not match the redirect URIs configured for the application '00000015-0000-0000-c000-000000000000'. Make sure the redirect URI sent in the request matches one added to your application in the Azure portal. Navigate to https://aka.ms/redirectUriMismatchError to learn more about how to fix this.

    (Since I am not an Entra ID expert, I might get some details wrong in the explanation, but this is what I think the issue is.)

    The issue here is that when you are working with D365FO, which is a Microsoft SaaS-ish service, a Service Principal is created for Microsoft's application in your Entra ID tenant. When you set up a new environment, the URL for that environment is added to that Service Principal as two ReplyUrls: one for the base URL and one for the OAuth endpoint.

    Apparently there is a limit (255) to how many of these URLs the Service Principal can have. This means that when you have deployed enough environments, the property fills up. I am guessing that there might be a clean-up routine for these, but that it sometimes fails.

    The solution is to remove a couple of old ones and manually add the new ones.

    1. Log into the Azure Portal

    2. Start the Cloud Shell

    3. In the Cloud Shell, run the following commands

    # Connect to Entra ID (requires the AzureAD PowerShell module)
    Connect-AzureAD
    
    # App ID of Microsoft's first-party ERP application
    $AADRealm = "00000015-0000-0000-c000-000000000000"
    
    # Inspect the Service Principal and its ReplyUrls
    Get-AzureADServicePrincipal -Filter "AppId eq '$AADRealm'"

    Find old, retired URLs in the ReplyUrls list, then run the following (the environment URLs below are placeholders):

    # URL of the new environment to add
    $EnvironmentUrl = "https://newenv.operations.eu.dynamics.com"
    
    # URL of a retired environment to remove
    $OLDEnvironmentUrl = "https://retiredenv.operations.eu.dynamics.com"
    
    $SP = Get-AzureADServicePrincipal -Filter "AppId eq '$AADRealm'"
    
    # Remove the retired environment's two ReplyUrls...
    $SP.ReplyUrls.Remove("$OLDEnvironmentUrl")
    $SP.ReplyUrls.Remove("$OLDEnvironmentUrl/oauth")
    
    # ...and add the two for the new environment
    $SP.ReplyUrls.Add("$EnvironmentUrl")
    $SP.ReplyUrls.Add("$EnvironmentUrl/oauth")
    
    # Write the updated list back to the Service Principal
    Set-AzureADServicePrincipal -ObjectId $SP.ObjectId -ReplyUrls $SP.ReplyUrls

    This will remove the retired URLs and add the ones for the new environment

    Links:
    Error AADSTS50011 – The reply URL specified in the request does not match the reply URLs configured for the application <GUID>. | Microsoft Learn
    Solved: AADSTS50011: The reply URL specified in the request does not match the reply URLs configured for the application: ‘00000015-0000-0000-c000-000000000000’.

  • Presentations from D365UG Sweden – Fall 2024

    Last week, on the 23rd of October, I had the pleasure of speaking in three sessions at the Dynamics 365 User Group Sweden, held at the Microsoft office in Stockholm.

    I had a great day, even if it started off with a presentation mishap, which I managed to correct in time together with one of my fantastic colleagues. The only lasting result was an overdose of adrenaline 🙂


    Here are the links to the presentations: D365UG Fall 2024

  • Unable to find my ER Data Package

    A week or so ago, one of my colleagues contacted me from one of our customers. She was setting up VAT reporting for Germany, and the instructions say that in order to set it up, you need to import a data package from the LCS Shared Asset Library. But when she went to the Shared Asset Library in LCS, it was completely empty.


    It turns out that this customer is on the EU LCS tenant, and Microsoft has not populated the Shared Asset Library in the EU. To get to the files, there is a fairly simple solution:

    1. Before you log into LCS, make sure you are in the United States region


    2. Log in to LCS. Note: You will not see your project in the list of projects. It will be empty. Don't worry, that is by design.
    3. Go to the Shared Asset Library – Data Packages. Find the package you are looking for, click it to download it.
    4. Log out from LCS, switch region and log in again.
    5. If you want it in your project's Asset Library, go to the Asset Library and upload the package.

      That is all for today, good luck 🙂