We have been running an integrated pipeline for RSAT tests for a while now, but all of the testing has been focused on stabilizing the tests and minimizing noise and false positives. For this we chose to run it against our SIT (OneBox) environment.
We have now decided to start testing for real in our UAT environment, which means that we need to redirect RSAT to connect to one of our Tier 2 environments instead. We will still use the SIT environment as an RSAT client, but with an alternate configuration. This is a short step-by-step guide on how to create the new configuration.
Log into the RSAT client computer and start RSAT
Change the URL and the SOAP URL to those of the Tier 2 (UAT) environment:
URL:
SOAP URL:
In LCS go to the UAT environment and export the RSAT Certificate. Remember to save the export password.
Upload the file and extract it on the RSAT client machine
Install the .cer file to Local Machine – Trusted Root Certificate Authorities
Import the .pfx file (using the password above) to the local machine
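If you want to script the two certificate steps above, something along these lines should work from an elevated prompt. This is only a sketch: the file names and the password variable are placeholders, and I am using plain certutil rather than anything RSAT-specific.

```python
import subprocess

# Placeholders – point these at the files you extracted from the LCS export.
CER_FILE = r"C:\Temp\RSATCertificate\AuthCert.cer"
PFX_FILE = r"C:\Temp\RSATCertificate\AuthCert.pfx"
PFX_PASSWORD = "<the export password from LCS>"

# Add the .cer file to Local Machine – Trusted Root Certification Authorities.
subprocess.run(["certutil", "-f", "-addstore", "Root", CER_FILE], check=True)

# Import the .pfx file (with its password) into the Local Machine store.
subprocess.run(["certutil", "-f", "-p", PFX_PASSWORD, "-importpfx", PFX_FILE], check=True)
```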
More details are here: https://aka.ms/lcsRsatReadme
In RSAT – Settings, change the certificate thumbprint to the one for your Tier 2 environment.
In order to use the new config but still be able to test against the SIT environment, click Save As at the top of the page and save the new configuration to a file. You can now manually switch between the two configurations and also use the different files in build and deploy pipelines, as in the sketch below.
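If you want to use the different configuration files in a pipeline, a step along these lines can swap the right one in before RSAT runs. This is only a rough sketch; all paths and file names below are made up, so point them at wherever you saved your configurations.

```python
import os
import shutil

# Hypothetical locations – adjust to where you saved your RSAT configuration files.
CONFIG_DIR = r"C:\RSAT\Configs"
ACTIVE_SETTINGS = r"C:\RSAT\Active\rsat.settings"

def select_rsat_config(environment: str) -> None:
    """Copy the saved settings file for the chosen environment (e.g. 'SIT' or 'UAT')
    over the settings file that RSAT / the pipeline actually loads."""
    source = os.path.join(CONFIG_DIR, f"{environment.lower()}.settings")
    if not os.path.isfile(source):
        raise FileNotFoundError(f"No saved RSAT configuration found at {source}")
    shutil.copyfile(source, ACTIVE_SETTINGS)
    print(f"RSAT now points at the {environment} configuration.")

if __name__ == "__main__":
    # The pipeline can pass the target environment as a variable.
    select_rsat_config(os.environ.get("RSAT_TARGET", "SIT"))
```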
That is all for today… Good Luck and happy testing 🙂
Back in 2018 I wrote an article on how to configure a D365FO instance to enable the Warehousing App. A lot of time has passed, and tonight I will set up the all-new and improved Warehouse App, so I thought I would also take the time to update the original article.
Go to the Azure portal. In Azure Active Directory – App Registrations, create a web application for the Warehouse App:
Name: WhatEverYouWant
Who can use this application or access this API: Accounts in this organizational directory only
Application Type: Web app/API
Sign-on URL: https://[theURLforyourdynamicsinstance]/oauth
Open the application to edit it
Verify Application ID. Save this for later…
Go to API Permissions and click Add a permission
Select the API called Microsoft Dynamics ERP (Microsoft.ERP)
Choose Delegated Permissions
Under Permissions to other applications, click Add application and add Microsoft Dynamics ERP
Add the following permissions:
– Access Dynamics AX online as organization users
– Access Dynamics AX data
– Access Dynamics AX Custom Service
Click Add permissions
Click Grant admin consent for [Your Organization] and confirm your choice
Go to Certificates and Secrets and click New client secret. Select an expiration and give the key a description like D365FO Warehousing App. Copy the secret value right away; it is only shown once and you will need it when configuring the app.
Log into Dynamics 365 for Operations, go to System Administration – Users and create a new user (in my case called WMAPP). The email address can be anything since it will never be used. The user needs this role:
– Warehouse mobile device user
Now we need to associate the user with the Azure AD application, which is done under System Administration – Setup – Azure Active Directory applications. Here we paste the Application ID/Client ID from before and select the user we created. Click Save and you are done.
Install the app from the app store and enter these settings:
1. Azure Active Directory ID: the Application ID/Client ID you saved earlier
2. Azure Active Directory Client Secret: the client secret you created above
3. Azure Active Directory Resource: your Dynamics 365 URL
4. Azure Active Directory Tenant: https://login.windows.net/yourAzureADtenant.onmicrosoft.com
5. Company: the Dynamics 365 for Operations legal entity
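Before punching the values into the app it can be worth verifying that the app registration, client secret and resource actually play together. The little script below requests a token using the standard Azure AD (v1) client-credentials flow; the tenant, client ID, secret and resource are placeholders you replace with your own values.

```python
import requests

# Placeholders – replace with your own values from the steps above.
TENANT = "yourAzureADtenant.onmicrosoft.com"
CLIENT_ID = "<Application ID from the app registration>"
CLIENT_SECRET = "<client secret created above>"
RESOURCE = "https://yourenvironment.operations.dynamics.com"  # your D365FO URL

# Azure AD v1 client-credentials token request.
response = requests.post(
    f"https://login.microsoftonline.com/{TENANT}/oauth2/token",
    data={
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "resource": RESOURCE,
    },
    timeout=30,
)
response.raise_for_status()
token = response.json()["access_token"]
print("Got a token – the app registration and secret look correct.")
```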
To configure the mobile app you can also create a JSON file with the connection settings; a rough sketch of what that could look like follows below.
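I have not included my own file here, but roughly it looks like the sketch below. Note that the key names are written from memory, so double-check them against the Microsoft documentation for the Warehouse Management mobile app before relying on this.

```python
import json

# NOTE: the key names below are assumptions based on my recollection of the
# documented connection-settings format – verify them against Microsoft's docs.
connection_file = {
    "ConnectionList": [
        {
            "ConnectionName": "PROD",
            "ActiveDirectoryClientAppId": "<Application ID>",
            "ClientSecret": "<client secret>",
            "ActiveDirectoryResource": "https://yourenvironment.operations.dynamics.com",
            "ActiveDirectoryTenant": "https://login.windows.net/yourAzureADtenant.onmicrosoft.com",
            "Company": "USMF",
            "IsDefaultConnection": True,
        }
    ]
}

with open("wma-connections.json", "w", encoding="utf-8") as f:
    json.dump(connection_file, f, indent=4)
```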
When Microsoft upgrades your Dynamics 365 for Operations Production Environment to a Self-Service environment there are some changes that we need to be aware of.
We no longer have a lead time of 5 hours before the deploy starts, which is great. It helps most in those cases where you, due to bad planning, need to perform two deploys back-to-back (yes, I know that Microsoft requires us to deploy everything in a single package).
The maximum deployment time goes down from 5 to 3 hours, which is also awesome.
This part might throw you off a bit…
Prior to Self Service we went to the Asset Library and marked the package as a Release Candidate in order to deploy it to PROD. This has changed a bit:
a. Go to the UAT environment where you deployed the package
b. Go to History – Environment Changes
c. Select the package you want to deploy to PROD and click “Mark as release candidate”
d. Go to the PROD environment and click Maintain – Update Environment
e. Select the UAT environment where you marked the package as a release candidate and the package will appear in the list.
f. Schedule the deploy as usual (note that you can deploy immediately if you want)
One of my colleagues had a question today… His customer has split their company into two tenants, and their Dynamics 365 for Operations was still in one of the tenants (Tenant A). The users in the new tenant (Tenant B) needed access, and he wondered how they could do this.
The first step for doing this is to create an Azure AD guest account invitation in the Azure Portal (In Tenant A). Go to Azure Active Directory – Users – New Guest User.
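If you prefer scripting to clicking, the same invitation can also be created through the Microsoft Graph invitations API. Here is a minimal sketch, assuming you already have an access token for Tenant A with permission to invite guests (for example User.Invite.All); the email address and redirect URL are placeholders.

```python
import requests

# Placeholder values – replace with your own.
ACCESS_TOKEN = "<token for Tenant A with User.Invite.All>"
GUEST_EMAIL = "user@tenantb.example.com"

# Create a B2B guest invitation in Tenant A via Microsoft Graph.
response = requests.post(
    "https://graph.microsoft.com/v1.0/invitations",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={
        "invitedUserEmailAddress": GUEST_EMAIL,
        "inviteRedirectUrl": "https://yourenvironment.operations.dynamics.com",
        "sendInvitationMessage": True,
    },
    timeout=30,
)
response.raise_for_status()
print("Invitation sent:", response.json().get("inviteRedeemUrl"))
```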
The user receives an email where they can approve the invite…
When the user in Tenant B accepts the invite, a “placeholder account” will be created in Tenant A and this account will be linked to the user's account in Tenant B. The user can now log in using his account in Tenant B but will be treated as a user in Tenant A, and the security policies from Tenant A (MFA) will be applied. Once the user has accepted the invite we need to import the user into Dynamics. Go to System Administration – Users and select Import User. Search for the user and import it.
Today one of my colleagues complained that he was not able to log into Dynamics 365 for Finance and Operations in Incognito mode on Chrome. A couple of seconds after logging in, he was redirected to https://home.dynamics.com/proxy/error
So… He was logged in but then it failed…
The reason for this issue is the change to third-party cookie management that Google made a while back. You can fix it in one of two ways.
Either you enable saving of third-party cookies in Incognito mode, or you add the domain [*.]microsoftonline.com to the exception list.
Today one of my colleagues contacted me because he had problems with the “Report sales tax for settlement period” report in one of our environments. It worked last Friday but not today. The error message was this:
When I looked for the Session ID in the Environment monitoring in LCS, one of the errors I got was this one:
Microsoft.Reporting.WebForms.Internal.Soap.ReportingServices2005.Execution.RSExecutionConnection+MissingEndpointException: The attempt to connect to the report server failed. Check your connection information and that the report server is a compatible version
and another one was:
TmpTaxReportById_TaxReportField on table TaxReportTmp_SE.
I already knew this was related to SSRS (Reporting Services), so obviously I began by restarting the Report Server, but it did not help.
Since the environment is a Microsoft-hosted Tier 2 environment the first error was really strange… why would Microsoft supply an incompatible SSRS server… Not likely…
Another error I got was an authentication error… this made me think that there was something wrong with the AOS. I tried restarting the IIS server… It helped!!!!!
When trying to set up hosted builds for D365FO in Azure DevOps according to this guide posted by Paul Heisterkamp, I bumped into an issue… in one of the last steps you push the NuGet packages, downloaded from LCS, into the artifact feed that you have created.
The issue came when uploading the largest package (microsoft.dynamics.ax.application.devalm.buildxpp.10.0.464.13.nupkg): first I got a timeout and had to set the timeout parameter in nuget.exe, roughly as in the sketch below.
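For reference, a push with a longer timeout can look roughly like this. I am wrapping nuget.exe in a small Python snippet just to keep the example self-contained; the feed URL is a placeholder you replace with your own.

```python
import subprocess

# Placeholders – replace with your own Azure Artifacts feed and package path.
FEED_URL = "https://pkgs.dev.azure.com/yourorg/_packaging/yourfeed/nuget/v3/index.json"
PACKAGE = "microsoft.dynamics.ax.application.devalm.buildxpp.10.0.464.13.nupkg"

# nuget.exe push with -Timeout (seconds); the default is far too short
# for a package of this size on a slow connection.
subprocess.run(
    [
        "nuget.exe", "push", PACKAGE,
        "-Source", FEED_URL,
        "-ApiKey", "az",   # Azure Artifacts ignores the key value but requires one
        "-Timeout", "3600",
    ],
    check=True,
)
```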
When I did that I got another error:
Error: Response status 503 – service unavailable
This happened after the upload had gone on for some time. Looking online I noticed that others having the same issue had discovered it to be a network issue (and in some cases proxy issues).
In order to solve the issue I uploaded the file to my D365FO dev machine and tried the push from there… it worked and it took only a couple of seconds… Wohoo…
The learning of the day… Azure VMs have more bandwidth than I do… who knew #abitenvious
This morning our developers reached out to me telling me that we had certificate issues with two of our environments. The problem with certificates is that they have an expiry date.
Since all Dynamics environments are deployed using LCS and the certificate used is owned by Microsoft, we have limited ability to fix the issue on our own. This is why Microsoft built functionality into LCS to help us with this. To fix the issue, just look up the environment in LCS, click Maintain and select Rotate Secrets.
The certificate you need to fix is the SSL certificate.
Simply click Rotate SSL Cert and wait for the process to finish. In my experience you will also need to reboot the VM.