Understanding Electronic Reporting Dependencies

I have been meaning to write a blog post about dependencies in Electronic Reporting in Dynamics 365 for Finance and Supply Chain for a while now… and I finally got a reason when a customer needed to move ER configurations to a cloud-hosted environment. For Tier 2+ environments you can do this by using the RCS repository, but that is not possible for cloud-hosted environments.

So I want to move ISO20022 Credit transfer (IE) to another environment.

In order to do that, we need to move all upstream dependencies. As you can see in the image, there are different levels in the ER config. To move level 3 we first need to create levels 1 and 2…

Otherwise we will get an error: Unresolved references left – Reference of the object ‘Object Name’ to the object ‘Model’ (GUID, version) cannot be established

Unfortunately I have not found the GUID useful here, since I am not able to see the GUID anywhere in the UI (maybe someone who understands this better can explain), but the version (61) gives us a hint. If we look at ISO20022 Credit transfer (IE), we can see that it has a dependency on ISO20022 Credit transfer version 43.61

and if we look at ISO20022 Credit transfer version 43.61, it has a dependency on Payment model version 43.

This means that we start by exporting and moving Payment model version 43.

Then we import it into the new environment.

Then we do the same with ISO20022 Credit transfer version 43.61, and when that is done we repeat for ISO20022 Credit transfer (IE) version 43.61.13.
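The rule "dependencies first" generalizes: with many configurations to move, the import order is a topological sort of the dependency graph. Here is a minimal, illustrative Python sketch (the names mirror the example above; this is not an ER API):

```python
# Given each configuration's upstream dependencies, produce the order in
# which the configurations must be imported (dependencies before dependents).
def import_order(deps):
    """Return configuration names ordered so dependencies come first."""
    order, seen = [], set()

    def visit(name):
        if name in seen:
            return
        seen.add(name)
        for parent in deps.get(name, []):
            visit(parent)          # import upstream configs first
        order.append(name)

    for name in deps:
        visit(name)
    return order

# The dependency chain from the example in this post.
deps = {
    "ISO20022 Credit transfer (IE) 43.61.13": ["ISO20022 Credit transfer 43.61"],
    "ISO20022 Credit transfer 43.61": ["Payment model 43"],
    "Payment model 43": [],
}
print(import_order(deps))
# ['Payment model 43', 'ISO20022 Credit transfer 43.61',
#  'ISO20022 Credit transfer (IE) 43.61.13']
```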

Normally I would do this using the RCS repository, but that is not possible for cloud-hosted environments.

Required batch job not running

One of my customers pinged me a while ago and mentioned that she had notifications in her Dynamics 365 Finance and Supply Chain environment saying that she was missing some batch jobs.

  • Scan for orphaned document references
  • Scan for document files that have been scheduled for physical deletion
  • Scans for temporary files that have expired
  • Deletes expired email history.
  • Recommendation batch job

All these jobs have to do with optimizing how the solution works and are part of Microsoft best practices… which means that if you contact support with a performance-related issue, enabling them is one of the tasks you might be given.

First of all, we need to take a step back and think about why these jobs are needed. Since FnO is a cloud-hosted “SaaS” solution, we have access to a limited amount of performance, and one of the reasons for running optimization and clean-up jobs is to keep the solution performant. We also pay for storage capacity, or at least we will once the new environments based on the Power Platform architecture are in full swing, and keeping the database and storage lean means we pay less.

Back to the issue at hand… There are five batch jobs the system says are best practice. To enable them, you can either go to Batch jobs in your system and set them to Waiting, or just click the link in the notification. In my case one job was actually missing from Batch jobs, and in that case the job was recreated and set to Waiting.


Besides these batch jobs, I would recommend that you set up the clean-up jobs documented on Microsoft Learn, and you should also read up on the new features coming for archiving and long-term retention.

That is all for today

Extended FnO fields in DataVerse

Last week I had an issue with Data Events. We had made a change to an entity in FnO and we wanted the new fields to be visible in the JSON message sent by Data Events, but we did not get the correct information in the message. I tested the following tips:

– Refreshed the entity list
– Refreshed the entity mappings
– Reactivated the Data Event
– Rebuilt the business event catalog (from the Manage tab in the Business Events Catalog)
– Did a DMF export to verify that the data was correct
– Checked the OData feed from FnO, and the fields looked OK. Data Events use virtual entities, and virtual entities are based on OData

When I looked in DataVerse, the added fields were not visible… strange… no wonder the message was not correct.

To get this working, you need to go to Advanced Find in CRM/CE and find the entity under Available Finance and Operations Entities. Open it by clicking on it.

The next part is a little hidden… if you click the — next to Refresh, it will turn into a checkbox.

Check it and then click Save

NOTE: When I did this the first few times it did not work. I am not positive why, but I think it might be because I did not wait for it to finish saving completely. Let it take its time… get a coffee (or other beverage of choice).

Once you have done this, go back to the maker portal and verify that the field is visible there

Thanks for the Tip, Nurlin

Links
How to refresh FinOps Virtual Entity in CDS ? – Dynamics 365 Finance Forum Community Forum

Set up the Azure Machine Learning Service for Demand Forecasting – Addendum

This article is sort of an addendum to the Microsoft Learn articles for setting up Demand Forecasting with Azure Machine Learning. I got a request from one of my colleagues to set this up in our demo environment, and as I went through the step-by-step guide I noticed that some of the steps were unclear, so I thought I would write down my own experiences:

1. To start, I downloaded the setup files from GitHub, and when I started setting them up I noticed some quirks. First of all, you need to install Azure CLI and the correct ML extension. You get Azure CLI from here. To install azure-cli-ml, run the following command:

az extension add --name azure-cli-ml

Note that there are two different Azure ML extensions, and the install script has a verification that checks that azure-cli-ml is installed… so the other one will not work (azure-cli-ml is the older version and ml is the newer version). The other thing to note is that azure-cli-ml and ml are incompatible. If you have installed ml, you need to uninstall it by running:

az extension remove -n ml

I have not tested if you can just change the install script to use the new one instead.

2. The second issue I noticed is that there is a parameter hardcoded in the PowerShell script, which is also documented in a GitHub issue. The culprit is the following line in quick_setup.ps1:

$computeInstance_Name = "notebookScryptExecutor"

Change the variable value to something else (note that there is a maximum length of 24 characters).
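A quick sanity check for the replacement value can save a failed run. In this Python sketch, the 24-character limit comes from the GitHub issue, while the character rules (start with a letter, then letters, digits or hyphens) are my assumption about Azure ML compute naming, so treat them as illustrative:

```python
import re

def valid_compute_name(name: str) -> bool:
    """Check a candidate compute instance name against the 24-char limit
    and an assumed letters/digits/hyphens pattern starting with a letter."""
    return len(name) <= 24 and re.fullmatch(r"[A-Za-z][A-Za-z0-9-]*", name) is not None

print(valid_compute_name("notebookScryptExecutor"))       # True (22 chars)
print(valid_compute_name("a-name-that-is-way-too-long"))  # False (27 chars)
```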

3. The install script will create the following resources:

– Azure AD App registration
– Azure ML Workspace
– Azure Storage Account with some containers

In order to do this, the correct permissions in Azure are needed. Otherwise, the script will fail.

Links
Set up the Azure Machine Learning Service
Demand forecasting setup
GitHub – microsoft/Templates-For-Dynamics-365-Supply-Chain-Management-Demand-Forecasting-With-Azure-Machine-Learning: Samples, templates and setup guides in order to run demand forecasting in Azure Machine Learning Service and integrate with Dynamics 365 SCM
Compute instance name hardcoded in quick_setup.ps1 · Issue #4 · microsoft/Templates-For-Dynamics-365-Supply-Chain-Management-Demand-Forecasting-With-Azure-Machine-Learning · GitHub

Running FnO on a Mac

Hi…

This week I got a question from one of my oldest friends and I got curious (as one does). The question was: “What are the limitations when running Dynamics 365 for Finance and Supply Chain on a Mac?”. Since I am not a Mac user, I thought I might as well document this for the future.

I reached out to a colleague and what I got was this:

Since Finance and Supply Chain itself is a web-based application, the core application will work without any problems in (almost) the browser of your choice (the supported browsers are Microsoft Edge, Google Chrome and Apple Safari). There are however three areas that do not work and require some workarounds.

  • Open in Excel – The Excel add-in is not available for the Mac version of Office. You are able to export to Excel, since that is just a plain download of a file that you then open, but Open in Excel requires an OData add-in for publishing the data back to FnO. The workaround is virtualization: by using Parallels or another virtualization product to run Windows on the Mac, you can install the Windows version of Office and run the Excel add-in.
  • Management Reporter Report Designer – Management Reporter Report Designer is a .NET application that is launched from the FnO web application and used for editing reports that are then viewed in FnO. It also requires Windows, and the virtualization workaround above is applicable here as well. The tool is however not used in day-to-day work in the system, which makes it a bit easier to live without.
  • Workflow editor – Just like the Report Designer above, the Workflow editor is a tool used for building and administering the workflows in the system… not so much in daily work. It too is a .NET application and will work with the same workaround.

Depending on the technical level of your users, virtualization might be a workable solution, or you could go with an old-fashioned terminal server/Citrix solution.

If you have any experiences around this that I have missed… please do not hesitate to reach out.

Links:
Install Windows 11 on a Mac with Apple M-series chip (parallels.com)
Options for using Windows 11 with Mac® computers with Apple® M1® and M2™ chips – Microsoft Support
System requirements for cloud deployments – Finance & Operations | Dynamics 365 | Microsoft Learn

Data Events issues after refresh

In a previous article I wrote a bit about Data Events in Dynamics 365 for Finance and Operations. Data Events are a really simple way to create event-based integrations based on changes in the different data entities in FnO. The Data Events functionality is based on functionality in Power Platform: the setup requires the installation of the Finance and Operations Virtual Entity solution in the Power Platform environment connected to FnO, and when you activate a Data Event trigger it also creates a virtual entity in DataVerse. This creates a couple of challenges when it comes to refreshing databases between environments.

For FnO
The settings for the endpoints created in FnO are partially stored in FnO and partially stored in an Azure Key Vault. The settings stored in FnO are, amongst others, the Key Vault URL, app registration ID and secret. To make sure that these settings do not get extracted from the environment (or accidentally moved to another environment), they are encrypted using an environment-specific key and are thus not readable in the destination environment for the refresh. To restore the functionality of Data Events in the destination environment, the endpoints need to be removed and recreated. After that has been done, we can re-activate the triggers.

Note that broken endpoints create an issue even if they are not being used. It seems like all endpoints are validated when you try to create a new one, which results in the creation failing.

For DataVerse/CRM
Since the functionality of Data Events is based on Virtual Entities created in DataVerse these will be overwritten when a refresh is done from one DataVerse environment to another. The error message you will get when the event is triggered is this:

Response status code does not indicate success: 404 ({"error":{"code":"0x80048d02","message":"Virtual entity 'mserp_vendvendorbankaccountentity' not found for external entity VendVendorBankAccountEntity"}}).

(of course, with a different entity name based on your scenario)
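Since the error body is plain JSON, you can pull out exactly which virtual entity is missing. This is a small illustrative Python sketch, not part of any FnO tooling:

```python
import json
import re

# The 404 body returned when a refresh has wiped the virtual entity.
response_body = (
    '{"error":{"code":"0x80048d02","message":"Virtual entity '
    "'mserp_vendvendorbankaccountentity' not found for external "
    'entity VendVendorBankAccountEntity"}}'
)

# Parse the payload and extract the entity name between the single quotes.
message = json.loads(response_body)["error"]["message"]
missing_entity = re.search(r"Virtual entity '([^']+)'", message).group(1)
print(missing_entity)  # mserp_vendvendorbankaccountentity
```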

The solution is to go to the Active Data Events tab in the Business Events workspace and remove and recreate each Data Event Trigger.

Note: You might have to wait a moment (a minute or so) before you recreate the trigger, in order for everything to be properly cleaned up in DataVerse.

Note: I have not been able to verify what happens if the identical triggers are set up in both source and destination environments. It might be that there are no issues or we might have the same issue. If anyone knows, please let me know 🙂

Lessons Learned: There has always been a lot to think about when you do refreshes… and DataVerse integration/DualWrite adds even more. There is a great article by Faisal Fareed here that details the steps that need to be done for DualWrite-integrated environments.

Links:
Not able to activate Data Events for entities – JohanPersson.nu
Microsoft Dynamics 365 and Power Platform Library: Steps to follow when refresh dual-write integrated environments (FO and CE) (daxture.blogspot.com)

Finding Electronic reporting files

I really like the fact that there is an entire reporting framework built into Dynamics 365 for Finance and Operations.

Today I got a question from a colleague about where to find the files generated by Electronic Reporting. Since I had not done this before, I am documenting it here.

Go to Electronic Reporting Jobs for the correct legal entity

Select the job and click Show Files

Select the file you want and click Open

Links:
Electronic reporting (ER) destinations

DualWrite syncing empty lines on Initial Sync

I had this strange issue today…

One of my customers has set up DualWrite in their DEV environment, and after some tweaking it worked OK. With that done, they wanted to move the entire solution into TEST and verify that it worked there. We packaged all the mappings into a solution, exported it from DEV and moved it into TEST. We started enabling the mappings for Legal Entity, and when we looked in CRM/CE we had a bunch of empty lines. We had the same number of lines, but all of them were empty.

If we changed the data on one of the Legal Entities it synced over just fine but all the rest were still empty.

When we looked at the DMF files for the data project for the initial sync, the files looked “good” to me… they contained data:

|NAME|,|LEGALENTITYID|
|LegalEntity1|,|AAA|
|LegalEntity2|,|AAB|
|LegalEntity3|,|AAC|
|LegalEntity4|,|AAE|
|LegalEntity5|,|AAF|
|LegalEntity6|,|AAG|
|LegalEntity7|,|AAH|
|LegalEntity8|,|AAI|
|LegalEntity9|,|AAJ|
|LegalEntity10|,|AAK|
|LegalEntity11|,|AAL|
|Company accounts data|,|dat|

Some of you might already see the issue 🙂 (don’t spoil the surprise). When I contacted support, they told me that there is a problem with the text qualifier in the file… the strings should be enclosed in " instead of |.
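The mismatch is easy to demonstrate with Python’s csv module (an illustrative sketch, not the actual DMF parser): the same row parsed with | as the text qualifier versus the default ".

```python
import csv
import io

# One line from the initial-sync file, using '|' as the text qualifier.
line = '|LegalEntity1|,|AAA|\r\n'

# Parsed with '|' as the quote character, the values come out clean...
with_pipe = next(csv.reader(io.StringIO(line), quotechar='|'))
print(with_pipe)     # ['LegalEntity1', 'AAA']

# ...but parsed with the default '"' qualifier, the pipes stay inside the
# data, so the values never match what the sync expects.
with_default = next(csv.reader(io.StringIO(line)))
print(with_default)  # ['|LegalEntity1|', '|AAA|']
```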

It turns out that someone had changed the default format CSV-Unicode in Data Management Framework to this:

I changed it back to this:

After cleaning out the records from CRM/CE and rerunning initial sync everything works again…

Figuring out DataVerse and DualWrite in Dynamics 365 FnO

This is a (probably the first) post to try to sort out my experiences around setting up a DataVerse connection to a Finance and Operations environment and figuring out how this interacts with Power Platform Admin Center, LCS and DualWrite.

Background

DualWrite is Microsoft’s go-to solution for integrating Dynamics 365 CE and Dynamics 365 FnO. It uses Dataverse and Power Platform extensively, which means that we are, in essence, merging two separate products into one, and that creates some challenges.

Since Microsoft is in the middle of a “Convergence” transition when it comes to managing these things, I realize that this is a moving target at the moment, which is why I will need to come back to this eventually.

This article will address some of the challenges that we have experienced, setting up DualWrite.

Since my primary focus is FnO I will start there.

There are a lot of clues to the fact that Microsoft sees Power Platform and DataVerse as an integral part of Dynamics 365 for Finance and Operations. The first one you will notice is that you get the option to create a Power Platform environment when you create a new FnO environment. Another clue is that none of the microservice add-ins that you can deploy from LCS are available to deploy if you have not connected your environment to Power Platform.

There are two different ways to create the DualWrite setup: setting up a new, empty environment when deploying a new FnO environment, or linking your FnO environment to an existing CRM/CE environment. Please remember that if you have an existing CRM environment with existing customizations (or a highly customized FnO environment), you should probably set up a proof of concept to evaluate how to handle the customizations. Keep in mind that the out-of-box mappings for DualWrite are created for vanilla environments.

Initial Setup

When setting up a new Finance and Operations environment you get the option of also setting up a new connected DataVerse environment. You will not get the option to connect an existing environment. You are able to opt out of this setup at the time of deployment if you want.

Regardless of what you choose the environment will be created and connected from the Power Platform side. On the LCS side there is no indication of any DataVerse environment.

Connecting to PowerPlatform

NOTE: This decision is IRREVERSIBLE. Once you have linked your FnO environment to a Power Platform environment there is no supported way to unlink it.

Once the environment is set up, LCS offers an option to set up the DataVerse connection. You can use the one provisioned for you, if you are not using CRM or are not planning to use DualWrite to interface with CRM, or you can link it to your existing Dynamics 365 for Sales (CRM) environment. Even though the connection is made to your existing/live CRM environment, the operation should be safe, since the Power Platform components are deployed to another “partition” of the environment. I know, the message in the upper right corner looks a bit scary…

This operation only enables the installation of add-ins; DualWrite still needs to be set up from within FnO when you are ready for it.

Lessons Learned

Since Microsoft is currently moving the management experience of Dynamics 365 for Finance and Operations environments to the Power Platform Admin Center, all of this is a changing scenario and I think what we are seeing is a transition to what is about to come.

Key Take-Aways

  • Do a gradual rollout, starting with some entities
  • If there is data that does not need to be synchronized, a different solution such as virtual entities or PowerApps could be an idea
  • Do a proof-of-concept to validate the setup

Links:
Enable integration during environment deployment
Microsoft Power Platform integration with Finance and Operations apps – Finance & Operations | Dynamics 365 | Microsoft Docs