Extended FnO fields in DataVerse

Last week I had an issue with Data Events. We had made a change to an entity in FnO, and we wanted the new fields to be visible in the JSON message sent by Data Events, but we did not get the correct information in the JSON message. This did not work… I tested the following tips:

– Refreshed the entity list
– Refreshed the entity mappings
– Reactivated the Data Event
– Rebuilt the business event catalog (from the Manage tab in the Business Events Catalog)
– Did a DMF export to verify that the data is correct
– Checked the OData feed from FnO, and the fields looked OK. Data Events use Virtual Entities, and Virtual Entities are based on OData

When I looked in Dataverse, the added fields were not visible… strange… No wonder the message was not correct.

To get this working, you need to go to Advanced Find in CRM/CE and find the entity under Available Finance and Operations Entities. Open it by clicking on it.

The next part is a little hidden… If you click the — beside Refresh, it turns into a checkbox.

Check it and then click Save

NOTE: When I did this the first few times it did not work. I am not positive why, but I think it might be because I did not wait for it to finish saving completely. Let it take its time… get a coffee (or other beverage of choice).

Once you have done this, go back to the maker portal and verify that the field is visible there

Thanks for the Tip, Nurlin

Links
How to refresh FinOps Virtual Entity in CDS ? – Dynamics 365 Finance Forum Community Forum

Set up the Azure Machine Learning Service for Demand Forecasting – Addendum

This article is sort of an addendum to the Microsoft Learn articles for setting up Demand Forecasting with Azure Machine Learning. I got a request from one of my colleagues to set this up in our demo environment, and as I went through the step-by-step guide I noticed that some of the steps were unclear, so I thought I would write down my own experiences:

1. To start, I downloaded the setup files from GitHub, and when I started setting them up I noticed some quirks. First of all, you need to install Azure CLI and the correct ML extension. You get Azure CLI from here. To install azure-cli-ml, run the following command:

az extension add --name azure-cli-ml

Note that there are two different Azure ML extensions, and the install script has a verification that checks that azure-cli-ml is installed… so the other one will not work (azure-cli-ml is the older version and ml is the newer one). The other thing to note is that azure-cli-ml and ml are incompatible. If you have installed ml, you need to uninstall it by running:

az extension remove -n ml

I have not tested if you can just change the install script to use the new one instead.

2. The second issue I noticed was that there is a parameter hardcoded in the PowerShell script, which is also documented in GitHub. The culprit is the following line in quick_setup.ps1:

$computeInstance_Name = "notebookScryptExecutor"

Change the variable value to something else. (Note that there is a maximum length of 24 characters.)
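The 24-character limit is easy to trip over, so a quick shell check of your replacement value before running the script can save a failed run (the name below is just an example, not a required value):

```shell
# Hypothetical replacement value for $computeInstance_Name in quick_setup.ps1
name="notebookExecutorDemo1"

# Azure ML compute instance names must be at most 24 characters
len=${#name}
echo "$name is $len characters"
if [ "$len" -le 24 ]; then
  echo "OK: within the 24-character limit"
else
  echo "Too long: pick a shorter name"
fi
```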

3. The install script will create the following resources

– Azure AD App registration
– Azure ML Workspace
– Azure Storage Account with some containers

In order to do this, the correct permissions in Azure are needed. Otherwise, the script will fail.

Links
Set up the Azure Machine Learning Service
Demand forecasting setup
GitHub – microsoft/Templates-For-Dynamics-365-Supply-Chain-Management-Demand-Forecasting-With-Azure-Machine-Learning: Samples, templates and setup guides in order to run demand forecasting in Azure Machine Learning Service and integrate with Dynamics 365 SCM
Compute instance name hardcoded in quick_setup.ps1 · Issue #4 · microsoft/Templates-For-Dynamics-365-Supply-Chain-Management-Demand-Forecasting-With-Azure-Machine-Learning · GitHub

Running FnO on a Mac

Hi…

This week I got a question from one of my oldest friends and I got curious (as one does). The question was: “What are the limitations when running Dynamics 365 for Finance and Supply Chain on a Mac?”. Since I am not a Mac user I thought; I might as well document this for the future.

I reached out to a colleague and what I got was this:

Since Finance and Supply Chain itself is a web-based application, the core application will work without any problems in (almost) the browser of your choice (the supported browsers are Microsoft Edge, Google Chrome and Apple Safari). There are, however, three areas that do not work and require some workarounds.

  • Open in Excel – The Excel add-in is not available for the Mac version of Office. You are able to export to Excel, since this is just a pure download of a file that you then open, but Open in Excel requires an OData add-in for publishing the data back to FnO. The workaround is to use virtualization. By using Parallels or another virtualization solution to run Windows on the Mac, you are able to install the Windows version of Office and run the Excel add-in.
  • Management Reporter Report Designer – Management Reporter Report Designer is a .NET application that is launched from the FnO web application and used for editing reports that are then viewed in FnO. This also requires Windows to run, and the virtualization workaround above is also applicable. The tool is, however, not used in day-to-day work in the system, which makes it a bit easier to live without.
  • Workflow editor – Just like the Report Designer above, the Workflow editor is a tool used for building and administering the workflows in the system… not so much in the daily work. As with Report Designer, it is also a .NET application and will work with the same workaround.

Depending on the technical level of your users, this virtualization approach might be a working solution, or you could go with an old-fashioned terminal server/Citrix solution.

If you have any experiences around this that I have missed… please do not hesitate to reach out.

Links:
Install Windows 11 on a Mac with Apple M-series chip (parallels.com)
Options for using Windows 11 with Mac® computers with Apple® M1® and M2™ chips – Microsoft Support
System requirements for cloud deployments – Finance & Operations | Dynamics 365 | Microsoft Learn

Data Events issues after refresh

In a previous article I wrote a bit about Data Events in Dynamics 365 for Finance and Operations. Data Events are a really simple way to create event-based integrations based on changes in the different data entities in FnO. The Data Events functionality is based on functionality in Power Platform. The setup requires the installation of the Finance and Operations Virtual Entity solution in the Power Platform environment connected to FnO. When you create a Data Event trigger, it also creates a virtual entity in DataVerse. This creates a couple of challenges when it comes to refreshing databases between environments.

For FnO
The settings for the endpoints created in FnO are partially stored in FnO and partially stored in an Azure Key Vault. The settings stored in FnO include, amongst others, the Key Vault URL, the app registration ID and the secret. To make sure that these settings do not get extracted from the environment (or accidentally moved to another environment), they are encrypted using an environment-specific key and are thus not readable in the destination environment of the refresh. To restore the functionality of Data Events in the destination environment, the endpoints need to be removed and recreated. After that has been done, we can re-activate the triggers.

Note that broken endpoints create an issue even if they are not being used. It seems like all endpoints are being validated when you try to create a new one which results in the creation failing.

For DataVerse/CRM
Since the functionality of Data Events is based on Virtual Entities created in DataVerse these will be overwritten when a refresh is done from one DataVerse environment to another. The error message you will get when the event is triggered is this:

Response status code does not indicate success: 404 ({"error":{"code":"0x80048d02","message":"Virtual entity 'mserp_vendvendorbankaccountentity' not found for external entity VendVendorBankAccountEntity"}}).

(of course, with a different entity name based on your scenario)

The solution is to go to the Active Data Events tab in the Business Events workspace and remove and recreate each Data Event Trigger.

Note: You might have to wait a moment (a minute or so) before you recreate the trigger, in order for everything to be properly cleaned up in DataVerse.

Note: I have not been able to verify what happens if the identical triggers are set up in both source and destination environments. It might be that there are no issues or we might have the same issue. If anyone knows, please let me know 🙂

Lessons Learned: There has always been a lot to think about when you do refreshes… and DataVerse integration/DualWrite adds even more. There is a great article by Faisal Fareed here that details the steps that need to be done for DualWrite-integrated environments.

Links:
Not able to activate Data Events for entities – JohanPersson.nu
Microsoft Dynamics 365 and Power Platform Library: Steps to follow when refresh dual-write integrated environments (FO and CE) (daxture.blogspot.com)

Finding Electronic reporting files

I really like the fact that there is an entire reporting framework built into Dynamics 365 for Finance and Operations.

Today I got a question from a colleague about where to find the files generated by Electronic Reporting. Since I had not done this before, I am documenting it here.

Go to Electronic Reporting Jobs for the correct legal entity

Select the job and click Show Files

Select the file you want and click Open

Links:
Electronic reporting (ER) destinations

DualWrite syncing empty lines on Initial Sync

I had this strange issue today…

One of my customers has set up DualWrite in their DEV environment, and after some tweaking it worked OK. With that done, they wanted to move the entire solution into TEST and verify that it worked. We packaged all the mappings into a Solution, exported it from DEV and moved it into TEST. We started enabling the mappings for Legal Entity, and when we looked in CRM/CE we had a bunch of empty lines. We had the same number of lines, but all of them were empty.

If we changed the data on one of the Legal Entities it synced over just fine but all the rest were still empty.

When we looked into the DMF files for the data project for the initial sync, the files looked “good” to me… they contained data:

|NAME|,|LEGALENTITYID|
|LegalEntity1|,|AAA|
|LegalEntity2|,|AAB|
|LegalEntity3|,|AAC|
|LegalEntity4|,|AAE|
|LegalEntity5|,|AAF|
|LegalEntity6|,|AAG|
|LegalEntity7|,|AAH|
|LegalEntity8|,|AAI|
|LegalEntity9|,|AAJ|
|LegalEntity10|,|AAK|
|LegalEntity11|,|AAL|
|Company accounts data|,|dat|

Some of you might already see the issue 🙂 (don’t spoil the surprise) When contacting Support, they told me that there is a problem with the text qualifier in the file… the strings should be enclosed in " instead of |.

It turns out that someone had changed the text qualifier of the default CSV-Unicode source format in the Data Management Framework from " to |. I changed it back to ".

After cleaning out the records from CRM/CE and rerunning the initial sync, everything worked again…
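If you ever need to patch already-exported files instead of regenerating them, a quick sed sketch can swap the qualifiers. This assumes | only ever appears as the text qualifier (never inside field values), and the file paths are just examples:

```shell
# One sample line exported with the broken | text qualifier
printf '|LegalEntity1|,|AAA|\n' > /tmp/broken.csv

# Replace every | with " – safe only because | is used solely as the qualifier
sed 's/|/"/g' /tmp/broken.csv > /tmp/fixed.csv

cat /tmp/fixed.csv   # "LegalEntity1","AAA"
```

If the data itself could contain pipes, fix the format in DMF and re-export instead of patching.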

Figuring out DataVerse and DualWrite in Dynamics 365 FnO

This is a (probably the first) post to try to sort out my experiences around setting up a DataVerse connection to a Finance and Operations environment and figuring out how this interacts with Power Platform Admin Center, LCS and DualWrite.

Background

DualWrite is Microsoft’s go-to solution for integrating Dynamics 365 CE and Dynamics 365 FnO. It uses Dataverse and Power Platform extensively, which means that we are, in essence, merging two separate products into one, which creates some challenges.

Since Microsoft is in the middle of a “Convergence” transition when it comes to managing these things, I realize that this is a moving target at the moment, which is why I will need to come back to this eventually.

This article will address some of the challenges that we have experienced, setting up DualWrite.

Since my primary focus is FnO I will start there.

There are a lot of clues to the fact that Microsoft sees Power Platform and DataVerse as an integral part of Dynamics 365 for Finance and Operations. The first one you will notice is that you get the option to create a Power Platform environment when you create a new FnO environment. Another clue is that none of the microservice add-ins that you can deploy from LCS are available if you have not connected your environment to Power Platform.

There are two different ways to create the DualWrite setup: setting up a new, empty environment when deploying a new FnO environment, or linking your FnO environment to an existing CRM/CE environment. Please remember that if you have an existing CRM environment with existing customizations (or a highly customized FnO environment), you should probably set up a proof of concept to evaluate how to handle the customizations. Keep in mind that the out-of-the-box mappings for DualWrite are created for vanilla environments.

Initial Setup

When setting up a new Finance and Operations environment you get the option of also setting up a new connected DataVerse environment. You will not get the option to connect an existing environment. You are able to opt out of this setup at the time of deployment if you want.

Regardless of what you choose the environment will be created and connected from the Power Platform side. On the LCS side there is no indication of any DataVerse environment.

Connecting to PowerPlatform

NOTE: This decision is IRREVERSIBLE. Once you have linked your FnO environment to a Power Platform environment there is no supported way to unlink it.

Once the environment is set up, LCS offers an option to set up the DataVerse connection. You can use the one provisioned for you (if you are not using CRM, or if you are not planning to use DualWrite to interface with CRM), or you can link it to your existing Dynamics 365 for Sales (CRM) environment. Even though the connection is made to your existing/live CRM environment, the operation should be safe, since the Power Platform components are being deployed to another “partition” of the environment. I know, the message in the upper right corner looks a bit scary…

This operation only enables the installation of add-ins; DualWrite still needs to be set up from within FnO when you are ready for it.

Lessons Learned

Since Microsoft is currently moving the management experience of Dynamics 365 for Finance and Operations environments to the Power Platform Admin Center, all of this is a changing scenario and I think what we are seeing is a transition to what is about to come.

Key Take-Aways

  • Do a gradual rollout, starting with some entities
  • If there is data that does not need to be synchronized, a different solution such as virtual entities or PowerApps could be an idea
  • Do a proof-of-concept to validate the setup

Links:
Enable integration during environment deployment
Microsoft Power Platform integration with Finance and Operations apps – Finance & Operations | Dynamics 365 | Microsoft Docs

Enable vs Enabled Features

I have learned that it is important to read the fine print… otherwise you will miss things. This happened to me and a colleague the other day.

In Dynamics 365 for Finance and Operations, Microsoft adds new features all the time. These are controlled in the Feature Management workspace, where you can also read up on the new features to understand their impact.

A cool feature is that there is a data entity in the Data Management Framework which exports and imports the feature set for a given environment. This enables moving feature settings from one environment to another, making it easier to manage the lifecycle of your features and sync them with, for instance, releases.

Now comes the part that I completely missed (which in hindsight is quite obvious):

There is a column called Enable Date, which I thought meant “the date the feature was enabled”. What it actually means is “the date the feature is enabled”… notice the subtle difference?? I did not 🙁

What the column actually does is set a schedule for when the feature will be enabled. When you use DMF to import a list of feature settings with this field set, you will schedule the enablement of those features. It is a great feature, but it might cause some issues if you are not aware of it, especially for features that cannot be turned off.
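To make the trap concrete, here is roughly what an imported feature settings file could look like (the column names below are illustrative, not the exact field names of the data entity):

```
FEATURE NAME,ENABLED,ENABLE DATE
"Some new feature",No,2023-12-01
"Another feature",Yes,
```

The first row does not record history; on import it schedules “Some new feature” to be enabled on 2023-12-01, whether or not you intended that.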

Remember: Read and understand the fine print

What happened to batch groups??

I need to confess something… Sometimes I do not read all of the release notes as thoroughly as I maybe should. This was made clear to me the other day when I tried to set up a batch job and make sure it was executing in a new Dynamics 365 for Finance and Operations production environment. Or maybe I read it, but did not really understand it…

Before, you were able to create a batch group and add servers to it. This was used in, for instance, AX 2012 or D365FO to control execution and divide resources in an optimal way between batch jobs, especially when you were dealing with large, time-consuming batches that you needed to ensure ran correctly while not starving smaller (time-critical) jobs of resources.

Starting with version 10.0.29 the default behaviour is that all Batch Servers are assigned to all Batch Groups. Each Batch Group has a scheduling priority set to either Low, Normal, High, Critical or Reserved Capacity and the batch jobs are then assigned to the Batch Group meaning that you will (almost) always have load balancing over all servers. The exception to the rule is if you choose Reserved Capacity.

Reserved Capacity means that you can, at the environment level, exclude a percentage of the total batch capacity (aka batch threads). The setup is done in System parameters – Batch global settings. The default setting is “No reserved capacity”, meaning that all batch server threads are available for load balancing. You are able to change this to Low, Medium or High (10, 15 or 25 percent), which will then exclude that batch capacity from the pool. Worth knowing is that when no batch jobs with Reserved Capacity are executing, the reserved batch threads will sit idle.
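As a quick sanity check of what those percentages mean in practice (the thread count below is made up; your environment's total depends on its batch servers):

```shell
# Hypothetical environment with 40 batch threads in total
total_threads=40

# Reserved-capacity levels: Low=10, Medium=15, High=25 (percent)
reserved_pct=25   # "High"

reserved=$(( total_threads * reserved_pct / 100 ))
pool=$(( total_threads - reserved ))

echo "Reserved for Reserved Capacity jobs: $reserved threads"   # 10
echo "Left in the load-balanced pool: $pool threads"            # 30
```

So with High selected, those 10 threads sit idle whenever no Reserved Capacity jobs are running.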


Links
https://docs.microsoft.com/en-us/dynamics365/fin-ops-core/dev-itpro/sysadmin/priority-based-batch-scheduling