DualWrite Filtering

As I have mentioned before, I am currently involved in implementing DualWrite with a customer, and in this case there is data in both CRM and FnO that we need to work around.

One example is a “single” contact that is used in multiple companies as an EDI address. The reason I wrote “single” is that, since we use the Global Address Book in FnO, the contact is one Party that is instantiated as a contact in each of the Customers that use it. This means that if you look in All Contacts there are a whole bunch of duplicates.

These contacts are then synced, using DualWrite, to CRM… much to the annoyance of my CRM colleagues. They are of no use in CRM, and I was asked to create a filter for them.

What I did was to set the field “Profession” in FnO (which was not being used) to EDI, and then I added the following to the filters for Customer Contacts in DualWrite:

FnO: (AssociatedContactType = 0) && (EMPLOYMENTPROFESSION != "EDI")
CRM: msdyn_contactfor eq 'Customer' and msdyn_sellable eq false and msdyn_contactpersonid ne null and jobtitle ne 'EDI'

As you can see, the syntax for the FnO and CRM sides is very different, and that was the main reason I wrote this post. I needed a place to document the syntax 🙂
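For quick reference, here is the rough operator mapping between the two filter dialects, based on the expressions above (and standard OData syntax on the CRM side); this is not an exhaustive list:

Equality: = (FnO) vs eq (CRM)
Inequality: != (FnO) vs ne (CRM)
Logical AND: && (FnO) vs and (CRM)
String literals: "EDI" (FnO, double quotes) vs 'EDI' (CRM, single quotes)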

Links

Filter Not Working in FO side for Dual Write (dynamics.com)

Being a BizApps MVP


About two years ago, I got that amazing email in my mailbox that started with “We’re pleased to present you with the Microsoft Most Valuable Professional (MVP) award,” which at the time (and still) was a bit surreal.

I have now had the title for two years, and I am writing this because I have just been traveling to my second MVP Global (in-person) Summit. This was an excellent time to reflect on what it means to be a BizApps MVP.

Microsoft MVP is a title you can be awarded for contributing to the community and helping others understand Microsoft products. You are awarded in one (or, in rare cases, two, or, even more rarely, three!) award categories. My area is Business Applications since my main focus is on Dynamics 365 for Finance and Supply Chain; other categories include Azure, Development, and Windows. With the award comes access to multiple channels into Microsoft, such as email distribution lists and Teams channels, where we can ask questions of the different teams at Microsoft and of our fellow MVPs.

Why I want to be an MVP

There are, of course, multiple reasons why I love being an MVP. Some of them are related to my work, and some of them are being part of a great community.

The Community

I have always been a huge fan of the “Community” and believe that together, we can be better and help each other. Being an MVP is a community title. Having a voice in the community is a great honor, and channeling the community feedback I get into the product groups and the product managers at Microsoft makes it all worth it. I truly believe that a good feedback loop between Microsoft and its customers benefits all. Being able to guide Microsoft to build a better product, making it more flexible where it makes sense, and helping them understand the customer challenges are important for customers and the product teams.

Meeting with the community at conferences and talking to end users and IT makes it possible to understand how the products are used. It also means that I get an opportunity to spread the best practices set in place by Microsoft.

In short, being an MVP means getting a voice and listening. You will listen to the users and consultants working in your area and be able to speak to (and be listened to by) Microsoft. It also works the other way; even though part of what we learn from Microsoft is under strict NDA, we can use it to understand and explain the direction Microsoft is taking the product we work with. Being a conduit between Microsoft and its customers is an important part of our work.

At work

As I mentioned above, the MVP title is not actually awarded for something you do in your line of work; in fact, paid work is not part of what is awarded. That does not mean that there is no connection to my work. When I talk to customers about challenges they are experiencing in projects with the product or when they need guidance on best practices, having the backing of the product groups at Microsoft and all the other fantastic MVPs is a great support.

This, however, is a two-way street. When my customers have challenges with implementing “their” Business Processes in Dynamics, being able to put the process in perspective together with Microsoft, give MS feedback on why something is not an optimal solution, or help them understand potential bugs also ensures that the product gets better for all Dynamics customers.

Access to Product Groups

This brings us to the PGIs, or Product Group Interactions, another important part of the role. The Microsoft product groups arrange regular meetings about the roadmap for new features; discussing design decisions, licensing, and what is coming down the line is invaluable, both for understanding the direction of the products and for relaying issues, licensing mismatches, and general feedback.

What is the Global MVP Summit?

As mentioned above, I just got home from the MVP Global Summit. One of my greatest experiences as an MVP was traveling to the Microsoft Campus in Seattle to experience three days of community, insight, and knowledge.

As if that was not enough, I also got to meet a lot of MVPs. Talking to everyone, not only the ones in my group, and understanding how they leverage their knowledge and tools to help customers all day is an inspiration. It means I will have more perspectives on our everyday challenges.

Key takeaways

Apart from the community and being able to meet the product teams and the other MVPs, I guess it comes as no surprise that this year’s MVP Summit had many sessions about Copilot, and with that, many discussions on Data Governance, Ethical AI, and, best of all in my opinion, the benefits of Copilot.

As for the sessions around Dynamics and Dataverse, there were many discussions on what is coming around Application Lifecycle Management and Security on Power Platform, which I look forward to digging into. On the Dynamics Finance and Supply Chain side, the highlights for me were a chance to look into the roadmap around the new environment strategy and the next chapter in the “One Dynamics, One Platform” story and, of course, Copilot.

In summary

Meeting many new people, reconnecting with friends, and experiencing the “MVP Summit Bubble” once again is a great honor. If it were possible to slow down time and forever stay in this, I would definitely do it. But all of this will remain in my heart and mind forever… I hope to be back once again for this experience.

Fetching App

Today I set up Synapse Link with a “bring your own data lake” for a customer. When I had gone through the configuration in Synapse Link and configured my D365FO entities, the process hung at the status message “Fetching App”.

After I looked around a bit I found the solution on Yammer… The issue was that the Service Principal for Common Data Service – Azure Data Lake Storage was missing from Microsoft Entra ID.

The solution:

New-AzADServicePrincipal -ApplicationId 546068c3-99b1-4890-8e93-c8aeadcfe56a

This command needs to be run by someone who is an Azure admin at the tenant level.

Note: While researching this, I also saw that the Service Principal for Power Query Online (AppId: f3b07414-6bf4-46e6-b63f-56941f3f4128) also might be missing. In that case… Run this: New-AzADServicePrincipal -ApplicationId f3b07414-6bf4-46e6-b63f-56941f3f4128
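If you want to check whether either service principal already exists in the tenant before creating it, a quick check like this should work (Az PowerShell, using the application IDs mentioned above; the cmdlet returns nothing if the principal is missing):

# Common Data Service – Azure Data Lake Storage
Get-AzADServicePrincipal -ApplicationId 546068c3-99b1-4890-8e93-c8aeadcfe56a

# Power Query Online
Get-AzADServicePrincipal -ApplicationId f3b07414-6bf4-46e6-b63f-56941f3f4128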

Good Luck

Unable to find my ER Data Package

A week or so ago, one of my colleagues contacted me from one of our customers. She was setting up VAT reporting for Germany, and the instructions say that in order to set it up you need to import a data package from the LCS Shared Asset Library. When she went to the Shared Asset Library in LCS, it was completely empty.


It turns out that this customer is on the EU LCS tenant and Microsoft has not populated the Shared Asset Library in the EU. To get to the files there is a fairly simple solution:

  1. Before you log into LCS, make sure you are in the United States region


  2. Log in to LCS. Note: You will not see your project in the list of projects. It will be empty. Don’t worry, it is by design.
  3. Go to the Shared Asset Library – Data Packages. Find the package you are looking for and click it to download it.
  4. Log out from LCS, switch region and log in again.
  5. If you want it in your project’s Asset Library, go to the Asset Library and upload the package.

That is all for today, good luck 🙂

Forcing Initial Sync in DualWrite

I am currently working on a project where we are implementing DualWrite in existing Dynamics 365 Environments. Since we do not have huge data volumes we decided to use Initial Sync in DualWrite to migrate some data from D365 Finance and Supply Chain over to D365 for Sales.

When we verified the data we noticed an issue with addresses. It turns out that the customer had all addresses set to the purpose Business, while the addresses that are configured in the default mapping for DualWrite are Delivery and Invoice addresses. Once the customer fixed the addresses, I thought: “Let’s just resync the Customer V3 entity using Initial Sync!”. Well… it turns out it was not that easy.

When the sync had run, it had synced 3 customers instead of around 1800…

When I looked at the Data Management project that Initial Sync generates (after a LOT of troubleshooting), I could see that the project was set to “Incremental Push Only”… I wanted it to do a complete sync… Why is this happening? What do I do now? There are 3 different ways that “might” solve this.

  1. Disable and Enable Change Tracking

    Your first option is to turn change tracking off and then on again. You do this in the Data Management workspace, under Data Entities.
  2. Reset DualWrite in FnO

    In the DualWrite section of the Data Management workspace, click Reset Link. This will let you set up the link again; as well as purging all of the settings, it will also reset the historical configuration. Since we had already synced most of the data, we did not have to run initial sync for most of the entities… we only re-ran the Customers V3 entity.


  3. Deleting the DMF Project

    Every initial sync creates a DMF project, and when I ran the Initial Sync for the second time, Dataverse tried to be smart and reuse the existing DMF project. Deleting the project meant that it had to be created again.
    (Thanks Nathan Clouse for this insight)

    Links:
    https://learn.microsoft.com/dynamics365/fin-ops-core/dev-itpro/data-entities/dual-write/dual-write-troubleshooting-initial-sync?WT.mc_id=DX-MVP-5004702#error-customer-map

Whitelisting IPs for FnO Dev Environments

I got a question today from a customer… “Could you show me how to add IPs to the whitelist for our FnO dev machines?”… Here goes:

  1. Log in to the Azure Portal
  2. Find the Azure VM that you would like to change
  3. Go to Network Settings and locate the rdp-rule


  4. Open the rule and add your IP address to the “Source IP addresses/CIDR ranges” field. If you have more than one IP, add a comma between the IPs.


  5. Click Save
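If you would rather script the change than click through the portal, a minimal Az PowerShell sketch could look like the one below. The resource group, NSG name, priority, and IP ranges are placeholders, so adjust them to match the existing rdp-rule on your VM:

# Placeholder names – use the resource group and network security group of your dev VM
$nsg = Get-AzNetworkSecurityGroup -ResourceGroupName "rg-fno-dev" -Name "MyDevVm-nsg"

# Rewrite the rdp-rule with the allowed source IPs; keep the other values
# (priority, RDP port 3389, Allow/Inbound) the same as on the existing rule
Set-AzNetworkSecurityRuleConfig -NetworkSecurityGroup $nsg -Name "rdp-rule" `
    -Access Allow -Protocol Tcp -Direction Inbound -Priority 110 `
    -SourceAddressPrefix "203.0.113.10/32","198.51.100.0/24" `
    -SourcePortRange "*" -DestinationAddressPrefix "*" -DestinationPortRange "3389"

# Push the updated rule back to Azure
$nsg | Set-AzNetworkSecurityGroup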

Protecting your dev VMs is important for a couple of reasons… The most important being that there are search engines on the internet that index RDP endpoints exposed to the Internet, and if your VM is in such a database, bad guys will start trying to break into it… and even if they might not succeed (LCS generates fairly good credentials), the attempts will trigger a policy that makes the VM unavailable for logins for a while, which, if nothing else, will stop your developers from doing their job.

Handling internal Vendors in DualWrite

At the moment I am involved in a DualWrite implementation between FnO and CE. The goal is eventually to be able to generate Quotes in CE and have them sync to FnO. As you might know, there are a lot of entities required to get to the point where we can sync Quotes; one of them is Vendors V2 and another is Released Products. In order to sync Released Products we first need to have Vendors.

This customer buys a lot of their products from an internal vendor (aka another legal entity in the same FnO instance). When we first synced Vendors, everything worked perfectly, with 100% completion (as far as we could see), but when we tried to sync Released Products we were missing Vendors.

Quite a lot… So digging into this, we found that these Vendors were never synced, which we found a bit strange (remember, 100% completion).

It turns out there is a filter in DualWrite that looks like this, and apparently internal Vendors are not of the type Organization… they are LegalEntity.

So with some modifications… it looks like this

But there is another thing we need to fix… We need to add the following line to the transform rule:

Once that was done, I forced another Initial Sync of Vendors, and once that completed I could successfully sync Products… Yay!!

Unable to import users in Cloud Hosted Environment

At one of my customers I had just set up a couple of new Cloud Hosted Environments (version 10.0.37, which turns out to be important), and when I tried to import users from Entra ID/Azure AD I got the following error:

Cannot Find Thumbprint by Certificatename

After some troubleshooting and looking through Yammer, I saw that others had the same issue. Apparently the issue had started happening after the 15th of November (which also turned out to be important).

It turns out that Microsoft had discovered a potential security issue in the template used for creating Cloud Hosted Environments. There used to be a connection in every Cloud Hosted Environment that allowed it to make lookups to Azure AD/Entra ID in order to import users. For security reasons, this connection is no longer there by default. You will still be able to add users manually, but if you want to import users you will need to create the connection in the virtual machine.

1. Create a new App Registration in Entra ID.

2. In the Cloud Hosted VM, run the following PowerShell snippet (in an elevated PowerShell prompt, aka Run as Administrator) to create a new certificate. A variant that also captures the thumbprint needed in step 8 is shown after the steps.

New-SelfSignedCertificate -CertStoreLocation Cert:\LocalMachine\My -DnsName "CHECert" -KeyExportPolicy Exportable -HashAlgorithm sha256 -KeyLength 2048 -KeySpec Signature -Provider "Microsoft Enhanced RSA and AES Cryptographic Provider" -NotBefore (Get-Date -Year 2020 -Month 5 -Day 1) -NotAfter (Get-Date -Year 2033 -Month 12 -Day 31)

3. Start “Manage Computer Certificates” and find your newly created certificate. It should be in Local Computer – Personal – Certificates and it should be called “CHECert”. Export the certificate with default settings (right-click – All Tasks – Export) and save it in a folder you will remember.

4. Go back to the App Registration you created in step 1 and go to Certificates & secrets. Under Certificates, click Upload certificate and choose your exported certificate.

5. You need to add a Redirect URI to the App Registration. Go to Authentication, click Add a platform – Web, and paste the URL of the Cloud Hosted Dynamics environment.

6. Add the following permissions to API Permissions and then click Grant admin consent…

7. In the Cloud Hosted VM, go back to “Manage Computer Certificates” and right-click the certificate – All Tasks – Manage Private Keys. Give NETWORK SERVICE permission to use the certificate.

8. In the Cloud Hosted VM, start Notepad as administrator and edit the K:\AOS service\Webroot\web.config file. Edit the following keys:

<add key="Aad.Realm" value="spn:[TheAppIdFromStep1]" />
<add key="Infrastructure.S2SCertThumbprint" value="[TheThumbprintFromStep2]" />
<add key="GraphApi.GraphAPIServicePrincipalCert" value="[TheThumbprintFromStep2]" />

9. In the Cloud Hosted VM, start an elevated Command Prompt and run iisreset.
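Since the thumbprint from step 2 is needed again in step 8, it can be handy to capture it directly when creating the certificate. This is a small variation of the command from step 2 (same parameters; the $cert variable is just for illustration):

# Create the certificate and keep a reference to it so the thumbprint can be read
$cert = New-SelfSignedCertificate -CertStoreLocation Cert:\LocalMachine\My -DnsName "CHECert" -KeyExportPolicy Exportable -HashAlgorithm sha256 -KeyLength 2048 -KeySpec Signature -Provider "Microsoft Enhanced RSA and AES Cryptographic Provider" -NotBefore (Get-Date -Year 2020 -Month 5 -Day 1) -NotAfter (Get-Date -Year 2033 -Month 12 -Day 31)

# This value goes into Infrastructure.S2SCertThumbprint and GraphApi.GraphAPIServicePrincipalCert in web.config (step 8)
$cert.Thumbprint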

Validate by trying to import users

Links
Secure one-box development environments

Cannot access form Sales charge codes

I had an issue today at a customer… We were not able to open the Charge code form in one of our environments.

When we tried to open the form we also got a couple more error messages. The first said that we could not read the Retail Headquarters parameters, which led us to try that form, where we got an error that looks like: Parameter record does not exist.

It turns out that this is a bug introduced in 10.0.37, which will be fixed in 10.0.38, related to the feature called Enable proper tax calculation for returns with partial quantity. When this feature is enabled, the system is not able to create a line in the Retail Headquarters parameter table because a default value is not allowed.

The workaround is to disable the feature temporarily, initiate the creation of the Retail parameters in the affected companies, and then re-enable the feature.

Good luck

Links
Details for issue 849710 (dynamics.com)

Issues with DBsync step during deploy


Today, when I was deploying a customization package to a newly deployed config environment, I had an issue with a step not working correctly. The environment had not yet been used for anything, so I hadn't even copied a database to it. When I deployed the customization package I got the following error in the runbook log and the deployment failed:

Table Sync Failed for Table: SQLDICTIONARY. Exception: System.NotSupportedException: TableID not yet generated for table: AmcBankReconciliations

The sync step in the runbook is failing because there is no TableID for the table AmcBankReconciliations. And I thought that was exactly what the sync process was supposed to do (??).

Having no clue about why this happened, I first turned to Google (as one does), and when I could not find anything there I asked my awesome colleagues, and one of them said:

“I have seen newly deployed environments behaving strangely and my solution usually is to start Visual Studio and perform a DB Sync”

This was a bit strange since it was the sync step that had failed, but I thought I would give it a try. Since this was a config environment that is not going to use Visual Studio, I instead opted for using the amazing d365fo.tools (d365collaborative/d365fo.tools on GitHub) to do the sync:

Invoke-D365DBSync -Verbose
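If the module is not already present on the environment, it can typically be installed from the PowerShell Gallery first; a minimal sketch (run from an elevated PowerShell prompt):

# Install the community d365fo.tools module from the PowerShell Gallery (one-time setup)
Install-Module -Name d365fo.tools -Scope AllUsers -Force

# Then run the database synchronization against the local D365FO installation
Invoke-D365DBSync -Verbose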

When the sync had finished, I resumed the deployment, and to my surprise it finished perfectly… Nice 🙂