Article: DynamicsCon Fall 2021 Preview: Why microservices matter to the future of Dynamics 365 for Finance and Operations

Microsoft Dynamics 365 for Finance and Operations (D365FO) is the latest iteration of Microsoft's flagship ERP system, previously called Dynamics AX (and before that, Axapta). Over all those years of software evolution, the ongoing goal has been to build a modern ERP system that is easier to support, maintain, and innovate on with new types of application development.

With D365FO, Microsoft has introduced many important changes that modernize the ERP experience and make it more attractive to customers. Key differences from the AX era include a new model for development, a One-Version policy for periodic updates, and an evolving approach to adding features to the ERP system.

Those of you who are already using D365FO have probably noticed feature management, where new features appear after updating to the latest version. Most new features represent new or improved capabilities based on what was already in the product, but sometimes a large new feature appears that provides a significant new tool or something that has been totally rebuilt from scratch.

When Microsoft creates these large changes, they are probably asking themselves (I do not work for Microsoft, so I cannot be sure), "Should this really become part of the core product?" There are some historic examples of this type of decision making from the product team: the retail components in Dynamics AX 2012, moving AX integrations to an external integration platform, and using SSRS instead of a built-in reporting engine.

Today, Microsoft is continuing to move traditional features out of D365FO and deploy modernized versions of them as cloud services that are accessible from the core ERP through APIs. The first one to become available was resource planning: the built-in MRP functionality has been deprecated in the core product and is instead available as an add-in installed from LCS. Another example is the configuration of exports to Data Lake (which will replace the Entity Store). In fact, Data Lake and Dataverse are an integral part of other microservice integrations.

A lot of these services are in preview now and will soon be generally available. A few other examples include Finance Insights, Tax Calculation, Electronic Invoicing, and Expense Management. I suspect we will be seeing a lot more of these in the future.

There are many reasons why this change is happening right now. When Microsoft "moved" AX to the cloud, "AX7" was mostly Dynamics AX 2012 R3 with a web interface. In the five or so years since then, we have seen a lot of change. The entire architecture has gone from a traditional on-prem software stack with a database and a Win32 frontend to a containerized solution with near zero-downtime maintenance. To improve the product even more, reduce complexity, increase performance, and make the core more resilient to customizations, Microsoft is building a more modularized solution that scales better and does not affect the speed of the core product. We see this clearly in MRP, where a resource planning job that used to take multiple hours now finishes in 15 minutes.

Another upside of this design is that these services can be updated without having to “respect” customizations made by customers and partners outside of Microsoft’s control. With just a lightweight interface for all parties to respect, Microsoft is free to iterate even faster than the monthly cadence of One-Version.

We are also seeing customers and ISV vendors use the same methodology when building code… "If it doesn't need to be in the ERP system, we should probably move it out." If you look at Microsoft Azure, you will also notice that there is a whole slew of services available for building these modular solutions.

If you want to learn more about these services, why they exist, how to use them, and whether you can build your own, please register for DynamicsCon and see my session called "Why Micro Services in Dynamics 365 for Finance and Operations". I will be there for the session and, of course, to answer all your questions.

This article was originally posted on MSDynamicsWorld on August 20th, 2021.

Build Pipelines with D365FO version 10.0.21

Hi all

Yesterday Microsoft released the PEAP version of 10.0.21, and on the new VM images they have decided to update a whole lot of stuff (Yay!!).

The new VMs are based on Windows Server 2019, SQL Server 2019, and Visual Studio 2019, which means that your existing pipelines in Azure DevOps will no longer work unless you make some minor changes. I basically made the adjustments from this article by Joris de Gruyter and added a final step to mitigate a SQL Server issue:

  • In the Build Solution step, change MSBuild Version to MSBuild 16.0
  • In the Database Sync step, change MSBuild Version to MSBuild 16.0
  • In the Execute Tests step, change Test Platform Version to Visual Studio 2019
  • In the script "C:\DynamicsSDK\DeploySSRSReports.ps1", on line 127, change
    Restart-Service -Name "ReportServer" -Force
    to
    Restart-Service -Name "SQLServerReportingServices" -Force
    (a more defensive variant of this patch is sketched below)
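
If you want that last patch to keep working on both the old and the new images, a small defensive variant is possible. This is only a sketch, assuming the only relevant difference is that SQL Server 2019 renamed the SSRS service:

# Restart whichever SSRS service name exists on the box
$ssrs = Get-Service -Name 'SQLServerReportingServices', 'ReportServer' -ErrorAction SilentlyContinue |
    Select-Object -First 1
if ($ssrs) {
    Restart-Service -Name $ssrs.Name -Force
}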
That's it... the SQL issue will most certainly be fixed in the released version of the VMs.

Links
Updating The Legacy Pipeline for Visual Studio 2017 (codecrib.com)

Fixing: The model version ‘3.6’ is not supported

When doing a refresh today of one of our D365FO Dev environments, I encountered an issue that puzzled me a bit. As part of the refresh, I ran the following SqlPackage command:

cd C:\Program Files (x86)\Microsoft SQL Server\140\DAC\bin

SqlPackage.exe /a:export /ssn:<server>.database.windows.net /sdn:MyNewCopy /tf:D:\Exportedbacpac\my.bacpac /p:CommandTimeout=1200 /p:VerifyFullTextDocumentTypesSupported=false /sp:<SQLAdmin password> /su:sqladmin

At first, it was not able to find the folder 140\DAC\bin, but there was a folder called 130\DAC\BIN.

Unfortunately, that SqlPackage was too old to handle the BacPac file.

First I tried to update Management Studio… unfortunately, that did not help. After researching a bit online, I found the Microsoft® SQL Server® Data-Tier Application Framework (18.2), which installed a later version of SqlPackage in C:\Program Files\Microsoft SQL Server\150\DAC\bin. That did the trick.
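
If you want to verify which SqlPackage version a folder actually contains before starting a long export, you can check the file version from PowerShell (the path below is the DacFx 18.2 location mentioned above):

# Print the file version of SqlPackage.exe to confirm the DacFx version in use
(Get-Item 'C:\Program Files\Microsoft SQL Server\150\DAC\bin\SqlPackage.exe').VersionInfo.FileVersion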

Links:
https://www.microsoft.com/en-us/download/confirmation.aspx?id=58207

Using RSAT for regression testing in a Tier-2 environment

We have been running an integrated pipeline for RSAT tests for a while now, but all of the testing has been focused on stabilizing the tests and minimizing noise and false positives. For that purpose, we chose to run it against our SIT (OneBox) environment.

We have now decided to start testing for real in our UAT environment, which means that we need to redirect RSAT to connect to one of our Tier 2 environments instead. We will still use the SIT environment as the RSAT client, but with an alternate configuration. This is a short step-by-step guide on how to create the new configuration.

  1. Log into the RSAT client computer and start RSAT
  2. Change the URL and SOAP URL to those of the Tier 2 environment:

    URL:
    SOAP URL:
  3. In LCS, go to the UAT environment and export the RSAT certificate. Remember to save the export password.
  4. Upload the file to the RSAT client machine and extract it
  5. Install the .cer file to Local Machine – Trusted Root Certificate Authorities
  6. Import the .pfx file (using the password from above) to the local machine (see the PowerShell sketch after this list)

    more details are here: https://aka.ms/lcsRsatReadme
  7. In RSAT – Settings, change the certificate thumbprint to the one for your Tier 2 environment.
  8. In order to use the new config but still be able to test against the SIT environment, click Save As at the top of the page and save the new configuration to a file. You can now manually switch between the two configurations and also use the different files in build and deploy pipelines.
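
For steps 5 and 6, a minimal PowerShell sketch could look like this (the file paths are assumptions based on where you extracted the certificate files):

# Install the RSAT certificate pair exported from LCS on the RSAT client machine.
# The .cer goes to Trusted Root Certification Authorities and the .pfx to the
# local machine's personal store, using the export password from step 3.
$cerPath  = 'C:\Temp\RsatCert.cer'   # assumed extraction path
$pfxPath  = 'C:\Temp\RsatCert.pfx'   # assumed extraction path
$password = Read-Host -AsSecureString 'Enter the certificate export password'

Import-Certificate    -FilePath $cerPath -CertStoreLocation Cert:\LocalMachine\Root
Import-PfxCertificate -FilePath $pfxPath -CertStoreLocation Cert:\LocalMachine\My -Password $password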

That is all for today… Good luck and happy testing 🙂

Links:
Regression suite automation tool installation and configuration – Finance & Operations | Dynamics 365 | Microsoft Docs
Troubleshoot the Regression suite automation tool – Finance & Operations | Dynamics 365 | Microsoft Docs

UPDATED: Configuring access for mobile warehousing app in Dynamics 365 for Operations

Back in 2018, I wrote an article on how to configure a D365FO instance to enable the Warehousing App. A lot of time has passed, and tonight I will set up the all new and improved Warehouse App, so I thought I would also take the time to update the original article.

  1. Go to the Azure portal. In Azure Active Directory – App Registrations, create a web application for the warehouse app (a scripted alternative is sketched after these steps):
    Name: WhatEverYouWant
    Who can use this application or access this API: Accounts in this organizational directory only
    Application Type: Web app/API
    Sign-on URL: https://[theURLforyourdynamicsinstance]/oauth
  2. Open the application to edit it
  3. Verify Application ID. Save this for later…
  4. Go to API Permissions and click Add a permission
  5. Select the API called Microsoft Dynamics ERP (Microsoft.ERP)
  6. Choose Delegated Permissions
  7. Under Permission to other applications click add application and add Microsoft Dynamics ERP
  8. Add the following permissions
    – Access Dynamics AX online as organization users
    – Access Dynamics AX data
    – Access Dynamics AX Custom Service
    Click Add Permissions
  9. Click Grant admin consent for [Your Organization] and confirm your choice
  10. Go to Certificates and Secrets and click New client secret. Select an expiration and give the key a description like D365FO Warehousing App
  11. Log into Dynamics 365 for Operations, go to System Administration – Users, and create a new user (in my case called WMAPP). The email address can be anything since it will never be used. The user needs this role:
    – Warehouse mobile device user
  12. Now we need to associate the user with the AD application, which is done in System Administration – Setup – Azure Active Directory applications. Here we paste the App ID/Client ID from before and select the user we created. Click Save and you are done.
  13. Install the app from the app store and enter these settings:
    1. Azure Active Directory ID: AppID/ClientID from step 3
    2. Azure Active Directory Client Secret: The key from step 10
    3. Azure Active Directory Resource: Your Dynamics 365 URL
    4. Azure Active Directory Tenant: https://login.windows.net/yourAzureADtenant.onmicrosoft.com
    5. Company: Your Dynamics 365 for Operations legal entity
  14. To configure the mobile app, you create a JSON file with the connection settings:
{
    "ConnectionList": [
        {
            "ActiveDirectoryClientAppId":"11111111-2222-3333-4444-111111111111",
            "ConnectionName": "YourConnection",
            "ActiveDirectoryResource": "https://{[yourdynamicsenvironent].cloudax.dynamics.com/",
            "ActiveDirectoryTenant": "https://login.windows.net/[yourtenantid].onmicrosoft.com",
            "Company": "USMF",
            "IsEditable": true,
            "IsDefaultConnection": true,
            "ConnectionType": "clientsecret"
        }
    ]
}
  15. The JSON file can either be uploaded to the device or converted to a QR code using an online QR code generator.
  16. Unfortunately/Thankfully, the client secret cannot be put in the file/QR code, and it has to be entered manually by editing the connection on the mobile device.
  17. To log in to the app you will need a user ID and password. Go to Warehouse Management – Setup – Worker – Users and select a user (in my case 24). Reset the password for worker 24.
  18. Log in to the app using User ID 24 and the new password you just set.
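
If you prefer to script the app registration and client secret (steps 1 and 10) instead of clicking through the portal, here is a rough sketch using the Az PowerShell module. The display name is just an example, the output property names assume the newer Microsoft Graph-based Az module, and the API permissions from steps 4-8 still have to be granted in the portal:

# Sketch: create the app registration and a client secret with Az PowerShell
Connect-AzAccount                                     # sign in to the right tenant

$app    = New-AzADApplication -DisplayName 'WMAPP'    # step 1: the registration
$secret = New-AzADAppCredential -ObjectId $app.Id     # step 10: a new client secret

$app.AppId            # Azure Active Directory ID (App ID/Client ID) for step 13
$secret.SecretText    # Azure Active Directory Client Secret for step 13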

Note:
To use the new Warehousing App, you will need to enable a feature in D365FO called User settings, icons, and step titles for the new warehouse app, which is available from version 10.0.17.

Links
Install and connect the warehouse app – Supply Chain Management | Dynamics 365 | Microsoft Docs

Changes to the release process for D365FO

When Microsoft upgrades your Dynamics 365 for Operations production environment to a Self-Service environment, there are some changes we need to be aware of.

  1. We no longer have a lead time of 5 hours before the deploy starts, which is great, mostly in those cases where you, due to bad planning, need to perform two deploys back-to-back (yes, I know that Microsoft requires us to deploy everything in a single package).
  2. The maximum deploy time goes down from 5 to 3 hours, which is also awesome.
  3. This part might throw you off a bit…

    Prior to Self-Service, we went to the Asset Library and marked the package as a Release Candidate in order to deploy it to PROD. This has changed a bit:

a. Go to the UAT environment where you deployed the package
b. Go to History – Environment Changes
c. Select the package you want to deploy to PROD and click “Mark as release candidate”
d. Go to the PROD environment and click Maintain – Update Environment
e. Select the UAT environment where you marked the package as a Release Candidate, and the package will appear in the list.
f. Schedule the deploy as usual (note that you can deploy immediately if you want)

That is all for today.

Adding External Users as Guest Users to Azure AD

One of my colleagues had a question today… His customer has split their company into two tenants, and their Dynamics 365 for Operations is still in one of the tenants (TenantA). The users in the new tenant (TenantB) needed access, and he wondered how they could do this.

The first step is to create an Azure AD guest account invitation in the Azure Portal (in TenantA). Go to Azure Active Directory – Users – New Guest User.
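
The same invitation can also be sent from PowerShell using the AzureAD module. A minimal sketch, where the email address and redirect URL are placeholders:

# Send the B2B guest invitation from TenantA
Connect-AzureAD   # sign in as an admin of TenantA

New-AzureADMSInvitation `
    -InvitedUserEmailAddress 'user@tenantb.example.com' `
    -InviteRedirectUrl 'https://myapps.microsoft.com' `
    -SendInvitationMessage $true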

The user receives an email where they can approve the invite…


When the user in TenantB accepts the invite, a "placeholder account" is created in TenantA, linked to the user's account in TenantB. The user keeps logging in with his TenantB account but is treated as a guest user in TenantA, and the security policies from TenantA (such as MFA) will be applied. Once the user has accepted the invite, we need to import the user into Dynamics. Go to System Administration – Users and select Import User. Search for the user and import it.

Finally, we need to add user roles as usual.

Troubleshooting issues with third-party cookies in Incognito mode

Today one of my colleagues complained that he was not able to log into Dynamics 365 for Finance and Operations in Incognito mode in Chrome. A couple of seconds after logging in, he was redirected to https://home.dynamics.com/proxy/error


So… he was logged in, but then it failed…

The reason for this issue is the change to third-party cookie management that Google made a while back. You can fix it in one of two ways.

Either you enable saving of third-party cookies in Incognito mode, or you add the domain [*.]microsoftonline.com to the exception list.

That is it for today… Good Luck

Johan

Links
https://community.dynamics.com/365/financeandoperations/f/dynamics-365-for-finance-and-operations-forum/363179/home-dynamics-com-proxy-signon-error

Troubleshooting Reporting Services in Dynamics 365 for Finance and Operations

Today one of my colleagues contacted me because he had problems with the "Report sales tax for settlement period" report in one of our environments. It worked last Friday but not today. The error message was this:

When I looked for the Session ID in the Environment monitoring in LCS, one of the errors I got was this one:

Microsoft.Reporting.WebForms.Internal.Soap.ReportingServices2005.Execution.RSExecutionConnection+MissingEndpointException: The attempt to connect to the report server failed. Check your connection information and that the report server is a compatible version

and another one was:

TmpTaxReportById_TaxReportField on table TaxReportTmp_SE.

I already knew this was related to SSRS (Reporting Services), so obviously I began by restarting the Report Server; it did not help.

Since the environment is a Microsoft-hosted Tier-2 environment, the first error was really strange… why would Microsoft supply an incompatible SSRS server? Not likely…

Another error I got was an authentication error… this made me think that there was something wrong with the AOS. I tried restarting the IIS server… It helped!!!!!
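
For future reference, here is the whole troubleshooting sequence as a quick PowerShell sketch (the service name assumes a standard Tier-2 box; on newer images the SSRS service is called SQLServerReportingServices, as noted in the pipeline post above):

# Step 1: restart the SSRS service (did not help in this case)
Restart-Service -Name 'ReportServer' -Force

# Step 2: restart IIS to recycle the AOS (this is what fixed the issue)
iisreset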