Extended FnO fields in Dataverse

Last week I had an issue with Data Events. We had made a change to an entity in FnO and wanted the new fields to be visible in the JSON message sent by Data Events, but we did not get the correct information in the message. I tested the following tips:

– Refreshed the entity list
– Refreshed the entity mappings
– Reactivated the Data Event
– Rebuilt the business event catalog (from the Manage tab in the Business Events Catalog)
– Did a DMF export to verify that the data is correct
– Checked the OData feed from FnO; the fields looked OK. Data Events use virtual entities, and virtual entities are based on OData

When I looked in Dataverse, the added fields were not visible… strange… No wonder the message was not correct.

To get this working, you need to go to Advanced Find in CRM/CE and find the entity under Available Finance and Operations Entities. Open it by clicking on it.

The next part is a little hidden… If you click the – next to Refresh, it will turn into a checkbox.

Check it and then click Save

NOTE: When I did this the first few times it did not work. I am not positive why, but I think it might be because I did not wait for it to finish saving completely. Let it take its time… get a coffee (or other beverage of choice).

Once you have done this, go back to the maker portal and verify that the field is visible there

Thanks for the Tip, Nurlin

Links
How to refresh FinOps Virtual Entity in CDS ? – Dynamics 365 Finance Forum Community Forum

Set up the Azure Machine Learning Service for Demand Forecasting – Addendum

This article is sort of an addendum to the Microsoft Learn articles for setting up Demand Forecasting with Azure Machine Learning. I got a request from one of my colleagues to set this up in our demo environment, and as I went through the step-by-step guide I noticed that some of the steps were unclear, so I thought I would write down my own experiences:

1. To start, I downloaded the setup files from GitHub, and when I started setting them up I noticed some quirks. First of all, you need to install Azure CLI and the correct ML extension. You get Azure CLI from here. To install azure-cli-ml, run the following command:

az extension add --name azure-cli-ml

Note that there are two different Azure ML extensions, and the install script has a verification that checks that azure-cli-ml is installed… so the other one will not work (azure-cli-ml is the older version and ml is the newer one). The other thing to note is that azure-cli-ml and ml are incompatible. If you have installed ml, you need to uninstall it by running:

az extension remove -n ml

I have not tested if you can just change the install script to use the new one instead.
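The check-and-swap can be sketched in shell. This is illustrative only, not the install script's actual code: the installed-extension list is hard-coded here, while in practice it would come from `az extension list --query "[].name" --output tsv`, and the echoed lines are the commands to run.

```shell
# Illustrative sketch: decide which az extension commands are needed.
# "ml" is the newer extension, which conflicts with the legacy
# "azure-cli-ml" that the setup script expects.
installed="ml"   # hard-coded here; normally the output of `az extension list`

if printf '%s\n' "$installed" | grep -qx 'ml'; then
  echo "az extension remove -n ml"
fi
if ! printf '%s\n' "$installed" | grep -qx 'azure-cli-ml'; then
  echo "az extension add --name azure-cli-ml"
fi
```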

2. The second issue I noticed was that there is a parameter hardcoded in the PowerShell script, which is also documented in GitHub. The culprit is the following line in quick_setup.ps1:

$computeInstance_Name = "notebookScryptExecutor"

Change the variable value to something else (note that there is a maximum length of 24 characters).
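A quick sanity check for the new name (a sketch; the 24-character limit is the one noted above, and the sample name is made up):

```shell
# Hypothetical name for the compute instance; check that it fits the
# 24-character limit before pasting it into quick_setup.ps1.
name="myDemandForecastCI"
len=${#name}
if [ "$len" -le 24 ]; then
  echo "ok: $len characters"
else
  echo "too long: $len characters (max 24)"
fi
```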

3. The install script will create the following resources:

– Azure AD App registration
– Azure ML Workspace
– Azure Storage Account with some containers

In order to do this, the correct permissions in Azure are needed. Otherwise, the script will fail.

Links
Set up the Azure Machine Learning Service
Demand forecasting setup
GitHub – microsoft/Templates-For-Dynamics-365-Supply-Chain-Management-Demand-Forecasting-With-Azure-Machine-Learning: Samples, templates and setup guides in order to run demand forecasting in Azure Machine Learning Service and integrate with Dynamics 365 SCM
Compute instance name hardcoded in quick_setup.ps1 · Issue #4 · microsoft/Templates-For-Dynamics-365-Supply-Chain-Management-Demand-Forecasting-With-Azure-Machine-Learning · GitHub

Running FnO on a Mac

Hi…

This week I got a question from one of my oldest friends, and I got curious (as one does). The question was: “What are the limitations when running Dynamics 365 for Finance and Supply Chain on a Mac?”. Since I am not a Mac user, I thought I might as well document this for the future.

I reached out to a colleague and what I got was this:

Since Finance and Supply Chain itself is a web-based application, the core application will work without any problems in (almost) the browser of your choice (the supported browsers are Microsoft Edge, Google Chrome, and Apple Safari). There are, however, three areas that do not work and require some workarounds.

  • Open in Excel – The Excel add-in is not available for the Mac version of Office. You are able to export to Excel, since that is just a plain download of a file that you then open, but Open in Excel requires an OData add-in for publishing the data back to FnO. The workaround is virtualization: by using Parallels or another virtualization product to run Windows on the Mac, you are able to install the Windows version of Office and run the Excel add-in.
  • Management Reporter Report Designer – Management Reporter Report Designer is a .NET application that is launched from the FnO web application and used for editing reports that are then viewed in FnO. This also requires Windows to run, and the virtualization workaround above is applicable here as well. The tool is, however, not used in day-to-day work in the system, which makes it a bit easier to live without.
  • Workflow editor – Just as with the Report Designer above, the Workflow editor is a tool used for building and administering the workflows in the system… not so much in the daily work. As with Report Designer, it is also a .NET application and will work with the same workaround.

Depending on the technical level of your users, this virtualization solution might work for you, or you could go with an old-fashioned terminal server/Citrix solution.

If you have any experiences around this that I have missed… please do not hesitate to reach out.

Links:
Install Windows 11 on a Mac with Apple M-series chip (parallels.com)
Options for using Windows 11 with Mac® computers with Apple® M1® and M2™ chips – Microsoft Support
System requirements for cloud deployments – Finance & Operations | Dynamics 365 | Microsoft Learn

Issues syncing Sales Order Lines with Dual Write

I am doing some experimenting with Synapse Link for Dataverse, and to do that I need my FnO data in Dataverse. The way I do that is using Dual-write. When I try to sync Sales Order Lines V2 to salesorderdetails, I get the following error:

Reason: Bad Request, Header x-ms-client-request-id f1004256-8a98-42fe-86a9-02b8e64a81a9, Produkten kan inte läggas till eftersom den inte är aktiv

(for those of you not fluent in Swedish, it says “The product cannot be added because it is not active”)

This means that the product is synced but it is not active. To activate the product it needs to be published. To do this go through the following steps:

  1. In Sales, open Products
  2. Select a product and click Publish

The problem was that I had a couple of thousand products. So I googled and found the forum thread below, which helped me write a workflow to automate it (this is way beyond my knowledge in CRM).

Links:
https://learn.microsoft.com/en-us/dynamics365/sales/publish-product-bundle-make-available-selling
Bulk publishing products – Dynamics 365 Sales Forum Community Forum

Data Events issues after refresh

In a previous article I wrote a bit about Data Events in Dynamics 365 for Finance and Operations. Data Events are a really simple way to create event-based integrations based on changes in the different data entities in FnO. The Data Events functionality is based on functionality in Power Platform, and the setup requires the installation of the Finance and Operations Virtual Entity solution in the Power Platform environment connected to FnO. When you create a Data Event trigger, it also creates a virtual entity in Dataverse. This creates a couple of challenges when it comes to refreshing databases between environments.

For FnO
The settings for the endpoints created in FnO are partially stored in FnO and partially stored in an Azure Key Vault. The settings stored in FnO are, among others, the Key Vault URL, App registration ID, and Secret. To make sure that these settings do not get extracted from the environment (or accidentally moved to another environment), they are encrypted using an environment-specific key and are thus not readable in the destination environment of the refresh. To restore the functionality of Data Events in the destination environment, the endpoints need to be removed and recreated. After that has been done, we can re-activate the triggers.

Note that broken endpoints create an issue even if they are not being used. It seems like all endpoints are validated when you try to create a new one, which results in the creation failing.

For DataVerse/CRM
Since the functionality of Data Events is based on Virtual Entities created in DataVerse these will be overwritten when a refresh is done from one DataVerse environment to another. The error message you will get when the event is triggered is this:

Response status code does not indicate success: 404 ({"error":{"code":"0x80048d02","message":"Virtual entity 'mserp_vendvendorbankaccountentity' not found for external entity VendVendorBankAccountEntity"}}).

(of course, with a different entity name based on your scenario)

The solution is to go to the Active Data Events tab in the Business Events workspace and remove and recreate each Data Event Trigger.

Note: You might have to wait a moment (a minute or so) before you recreate the trigger, in order for everything to be properly cleaned up in Dataverse.

Note: I have not been able to verify what happens if identical triggers are set up in both the source and destination environments. It might be that there are no issues, or we might have the same issue. If anyone knows, please let me know 🙂

Lessons learned: There has always been a lot to think about when you do refreshes… and Dataverse integration/Dual-write adds even more. There is a great article by Faisal Fareed here that details the steps that need to be done for Dual-write-integrated environments.

Links:
Not able to activate Data Events for entities – JohanPersson.nu
Microsoft Dynamics 365 and Power Platform Library: Steps to follow when refresh dual-write integrated environments (FO and CE) (daxture.blogspot.com)

Finding Electronic reporting files

I really like the fact that there is an entire reporting framework built into Dynamics 365 for Finance and Operations.

Today I got a question from a colleague about where the files generated by Electronic Reporting end up. Since I had not done this before, I am documenting it here.

Go to Electronic Reporting Jobs for the correct legal entity

Select the job and click Show Files

Select the file you want and click Open

Links:
Electronic reporting (ER) destinations

AX 2012 Reporting Services Troubleshooting – Part 2

In the last post I was troubleshooting the SSRS connection from AX 2012. I was quite happy when the settings validated correctly, but I was still not able to run the report… I got this error:

Error while setting server report parameters. Error message: The DefaultValue expression for the report parameter 'AX_CompanyName' contains an error: Request for the permission of type 'System.Security.Permissions.EnvironmentPermission, mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089' failed. (rsRuntimeErrorInExpression)

After some googling I found someone having the same issue (link below).

Apparently you need to change the PermissionSetName in the rssrvpolicy.config file on the Reporting Server. Start Notepad as administrator and open the file from the folder C:\Program Files\Microsoft SQL Server\MSRS13.MSSQLSERVER\Reporting Services\ReportServer (you should probably back it up first).

Change this:
<CodeGroup class="UnionCodeGroup" version="1" PermissionSetName="Execution" Name="Report_Expressions_Default_Permissions"

To this:
<CodeGroup class="UnionCodeGroup" version="1" PermissionSetName="FullTrust" Name="Report_Expressions_Default_Permissions"

That is all for today

Links
SSRS Report Error – AX_CompanyName – DAXNigel (wordpress.com)

AX 2012 SSRS Troubleshooting

Today I got the following message… “Could you just fix the reports on this AX 2012 environment… It is broken”

When I tried to validate the report configuration, I got the following error:

The SQL Server Reporting Services server name [servername] does not exist or the Web service URL is not valid.

OK… First test:

Can I browse to the report URLs in System Administration – Business Intelligence – Reporting Services – Report Servers?

No, I could not from the AX server. It worked from the SQL Server, so I checked the firewall on the SSRS server and added port 80.

I still got the same issue when validating. So I opened the Microsoft Dynamics AX 2012 Management Shell as an administrator and ran Test-AXReportServerConfiguration.

What?! I opened the port, and I can browse to the URLs. Apparently the AX scripts test for the remote admin ports, so I opened those ports using this netsh command:

netsh.exe firewall set service type=REMOTEADMIN mode=ENABLE scope=ALL

Presto!! The validation was successful!!


Links
Configure a Report Server for Remote Administration

Working with XSD files in AX 2012

Today I got a question from one of my colleagues… They needed to send a complete AIF port specification to an external service provider, and the XSD (XML Schema Definition) they could find only contained the internal AX types; it specifically lacked the information about the maximum length of the fields.

Since I am not a developer and not that knowledgeable in the AOT, I reached out to a couple of our developers. That is when I learned about the Shared Types Schema. It contains the complete list of types, and it also acts as a sort of translation table for the regular XSD; it is merged in when the service is explored.

Note to self (for when I forget this): The reason for having all the extended types in the original XSD is that we are able to change the field setup in AX and have it propagate to the entire system.
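As an illustration (with made-up type names and lengths), the shared types schema is where an AX extended type resolves to a base type with its real constraints, such as the maximum string length:

```xml
<!-- Hypothetical fragment of a shared types schema: the extended type
     is a restriction of xs:string with an explicit maximum length. -->
<xs:simpleType name="AxdExtType_CustomerName">
  <xs:restriction base="xs:string">
    <xs:maxLength value="60"/>
  </xs:restriction>
</xs:simpleType>
```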

To get to the XSD for the AIF port, go to the port (System Administration – Setup – Services and Application Integration Framework). Select the port, verify that Customize Document is checked, and click Data Policies.

Click on View Schema

Here you can see the XSD for the port. If you click Imported Schema, you will open the Shared Types Schema. Save both the regular schema and the shared schema and send them to whoever needs them.

Links
About the Shared Types Schema

DualWrite syncing empty lines on Initial Sync

I had this strange issue today…

One of my customers has set up Dual-write in their DEV environment, and after some tweaking it worked OK. With that done, they wanted to move the entire solution into TEST and verify that it worked. We packaged all the mappings into a solution, exported it from DEV, and moved it into TEST. We started enabling the mappings for Legal Entity, and when we looked in CRM/CE we had a bunch of empty lines. We had the same number of lines, but all of them were empty.

If we changed the data on one of the Legal Entities it synced over just fine but all the rest were still empty.

When we looked into the DMF files for the data project for the initial sync, the files looked "good" to me… they contained data:

|NAME|,|LEGALENTITYID|
|LegalEntity1|,|AAA|
|LegalEntity2|,|AAB|
|LegalEntity3|,|AAC|
|LegalEntity4|,|AAE|
|LegalEntity5|,|AAF|
|LegalEntity6|,|AAG|
|LegalEntity7|,|AAH|
|LegalEntity8|,|AAI|
|LegalEntity9|,|AAJ|
|LegalEntity10|,|AAK|
|LegalEntity11|,|AAL|
|Company accounts data|,|dat|

Some of you might already see the issue 🙂 (don't spoil the surprise). When I contacted Support, they told me that there is a problem with the text qualifier in the file… the strings should be enclosed in " instead of |.

It turns out that someone had changed the text qualifier of the default CSV-Unicode source format in the Data Management Framework from " to |, so I changed it back to ".

After cleaning out the records from CRM/CE and rerunning the initial sync, everything worked again…
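The difference can be shown in a couple of lines of shell, using one row from the file above (a sketch only; the real fix is of course the format setting in DMF, not rewriting the file):

```shell
# One row as exported with | as the text qualifier...
bad='|LegalEntity1|,|AAA|'
# ...and the same row with the default " qualifier that the initial sync expects.
good=$(printf '%s' "$bad" | sed 's/|/"/g')
echo "$good"   # prints "LegalEntity1","AAA"
```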