Today one of my colleagues complained that he was not able to log into Dynamics 365 for Finance and Operations in Incognito mode in Chrome. A couple of seconds after logging in he was redirected to https://home.dynamics.com/proxy/error
So… he was logged in, but then it failed. The reason for this issue is the change to third-party cookie handling that Google made a while back. You can fix it in one of two ways.
Either you enable saving of third-party cookies in Incognito mode, or you add the domain [*.]microsoftonline.com to the exception list.
Today one of my colleagues contacted me about a problem with the “Report sales tax for settlement period” report in one of our environments. It worked last Friday but not today.
When I looked for the Session ID in the Environment monitoring in LCS, one of the errors I got was this one:
Microsoft.Reporting.WebForms.Internal.Soap.ReportingServices2005.Execution.RSExecutionConnection+MissingEndpointException: The attempt to connect to the report server failed. Check your connection information and that the report server is a compatible version
and another one was:
TmpTaxReportById_TaxReportField on table TaxReportTmp_SE.
I already knew this was related to SSRS (Reporting Services), so obviously I began by restarting the Report Server. It did not help.
Since the environment is a Microsoft-hosted Tier-2 environment, the first error was really strange… why would Microsoft supply an incompatible SSRS server? Not likely…
Another error I got was an authentication error… this made me think that something was wrong with the AOS. I tried restarting the IIS server… it helped!
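For anyone who wants to try the same remedies from a command line, here is a minimal sketch (run in an elevated command prompt on the VM; the SSRS service name is an assumption on my part and varies by SQL Server version, so check services.msc first):

```shell
rem Restart IIS, which hosts the AOS web application
iisreset /restart

rem Restart Reporting Services; the service name below is an assumption --
rem it is "ReportServer" on SQL Server 2016 and earlier,
rem "SQLServerReportingServices" on 2017 and later
net stop ReportServer
net start ReportServer
```

In my case the IIS restart alone was enough, but restarting SSRS first is the cheaper experiment if the error points at the report server.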
When trying to set up hosted builds for D365FO in Azure DevOps according to this guide posted by Paul Heisterkamp, I bumped into an issue… in one of the last steps you push the NuGet packages, downloaded from LCS, into the artifact feed that you have created.
The issue came when uploading the largest package (microsoft.dynamics.ax.application.devalm.buildxpp.10.0.464.13.nupkg): first I got a timeout and had to set the timeout parameter in nuget.exe.
When I did that I got another error:
Error: Response status 503 – service unavailable
This happened after the upload had gone on for some time. Looking online I noticed that others having the same issue had discovered it to be a network issue (and in some cases proxy issues).
In order to solve the issue I uploaded the file to my D365FO dev machine and tried the push from there… it worked, and it took only a couple of seconds… Woohoo…
The learning of the day… Azure VMs have more bandwidth than I do… who knew #abitenvious
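For reference, the push command with an increased timeout looked roughly like this — the feed URL is a placeholder for illustration, and -Timeout is given in seconds (the default of 300 was too short for this package):

```shell
rem Push the LCS NuGet package to an Azure Artifacts feed.
rem The <yourorg>/<yourfeed> parts of the URL are placeholders -- substitute your own.
nuget.exe push microsoft.dynamics.ax.application.devalm.buildxpp.10.0.464.13.nupkg ^
  -Source "https://pkgs.dev.azure.com/<yourorg>/_packaging/<yourfeed>/nuget/v3/index.json" ^
  -ApiKey az -Timeout 3600
```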
This morning our developers reached out to me telling me that we had certificate issues with two of our environments. The problem with certificates is that they have an expiry date.
Since all Dynamics environments are deployed using LCS and the certificate used is owned by Microsoft, we have limited ability to fix the issue on our own. This is why Microsoft built functionality into LCS to help us with this. To fix the issue, just look up the environment in LCS, click Maintain and select Rotate Secrets.
The cert you need to fix is the SSL certificate.
Simply click Rotate SSL Cert and wait for the process to finish. In my experience you will also need to reboot the VM.
Since I am not a developer… especially not an X++ developer, I am usually not entrusted with access to Visual Studio (this statement was more for dramatic effect, but the truth is that I am trying to avoid it so as not to mess things up).
The main reason for this workaround is that in some environments Visual Studio is not set up, and doing that would require setting up the correct account, mapping workspaces and a whole lot more. This way is simpler:
Last week Microsoft finally released the complete PDFs with new features of Dynamics 365 and Power Platform Wave 1 2020. I browsed through them (all 405 pages of them 🙂) to try to understand where we are going… here are my top 12.
Mass deployment of the Warehouse App. This gives an organization the ability to use Microsoft Intune (or your favorite MDM solution) to deploy the warehouse app to mobile devices.
The UI will start adapting to the license the user has assigned, in order to make the UI “less complex”. This means that it is no longer possible to get by under-licensed.
The new grid now allows you to group rows based on a column and also to easily rearrange the columns.
Tables, Entities and Aggregate Measurements can be stored in Data Lake Storage Gen2. Prior to this release we were only able to store Aggregate Measurements in Data Lakes.
The ability to embed a Power Automate Flow directly in the UI in Dynamics. This flow can be triggered by the end user. This feature is even cooler since it was suggested on the Dynamics Ideas site.
Azure AD sign-in in the POS client. This could previously only be done using a worker ID.
The ability to create a PowerApp directly in Microsoft Teams.
Great improvements for running model-driven apps offline. Really cool since this is geared towards mobile front line workers.
Monitor and get insight into usage of your canvas apps using Azure Application Insights.
Power Apps Test Studio is now generally available. This gives you the possibility to automate testing and integrate it into Azure DevOps build and release pipelines.
Simplified expressions in Power Automate. Instead of having to write a complex expression for a string operation there are ready-made blocks for this.
Copy and Paste in the Power Automate designer (Yay)
As you can see, this selection of features is quite tech/IT heavy… There are a LOT more app-related features in the Wave 1 release. Look through the PDFs linked below to get a grasp of these.
One thing to note is that this is Wave 1… Not everything in Wave 1 is generally available, or in some cases even in preview… but it is coming. Something to look forward to.
I had an issue today with one of my test cases in RSAT. When running it manually from the RSAT UI it worked, but when trying to run it from a PowerShell script I got this error:
Failed to download or overwrite attachment files – Some files maybe in use.
For some reason the test case had uploaded duplicate files to Azure DevOps. When I ran the test from RSAT it used the local files generated by RSAT and everything worked, but when I ran it from the console it downloaded the files from DevOps, got the duplicates and for some reason failed.
The solution was to permanently delete the test case from DevOps and resync it from BPM.
Beware that this will give it another ID, which will change the order of the tests.
When running Dynamics 365 for Finance and Operations there is some legacy software built in. It is packaged as ClickOnce applications that are downloaded every time you run them. In Internet Explorer and the old version of Microsoft Edge this worked out of the box, since those browsers are .NET aware. When it comes to the new Chromium-based Microsoft Edge and Google Chrome this is not the case. In order to get these working we need a plugin installed in the browser.
For Microsoft Edge (Chromium), the installation process is almost the same as for Google Chrome, but with an additional step before starting.
In one of our environments we had an issue where we were not able to import a data package from a file. When trying the import, it simply did not list the entities included in the package. So we tried the usual: restarting the DIXF service, restarting the VM… No luck.
Looking in the Event Viewer we found an error from the Microsoft Dynamics AX DIXF Service Runtime Provider:
Error In SSIS Execution
System.Exception: SYS105313 ---> System.InvalidOperationException: The 'Microsoft.ACE.OLEDB.12.0' provider is not registered on the local machine.
   at System.Data.OleDb.OleDbServicesWrapper.GetDataSource(OleDbConnectionString constr, DataSourceWrapper& datasrcWrapper)
   at System.Data.OleDb.OleDbConnectionInternal..ctor(OleDbConnectionString constr, OleDbConnection connection)
   at System.Data.OleDb.OleDbConnectionFactory.CreateConnection(DbConnectionOptions options, DbConnectionPoolKey poolKey, Object poolGroupProviderInfo, DbConnectionPool pool, DbConnection owningObject)
   at System.Data.ProviderBase.DbConnectionFactory.CreateNonPooledConnection(DbConnection owningConnection, DbConnectionPoolGroup poolGroup, DbConnectionOptions userOptions)
   at System.Data.ProviderBase.DbConnectionFactory.TryGetConnection(DbConnection owningConnection, TaskCompletionSource`1 retry, DbConnectionOptions userOptions, DbConnectionInternal oldConnection, DbConnectionInternal& connection)
   at System.Data.ProviderBase.DbConnectionInternal.TryOpenConnectionInternal(DbConnection outerConnection, DbConnectionFactory connectionFactory, TaskCompletionSource`1 retry, DbConnectionOptions userOptions)
   at System.Data.ProviderBase.DbConnectionInternal.OpenConnection(DbConnection outerConnection, DbConnectionFactory connectionFactory)
   at System.Data.OleDb.OleDbConnection.Open()
   at Microsoft.Dynamics.AX.Framework.Tools.DMF.DriverHelper.DMFOdbcDriver.GetSheetNamesFromExcel(String sourceFileURL)
   --- End of inner exception stack trace ---
   at Microsoft.Dynamics.AX.Framework.Tools.DMF.DriverHelper.DMFOdbcDriver.GetSheetNamesFromExcel(String sourceFileURL)
   at Microsoft.Dynamics.AX.Framework.Tools.DMF.SSISHelperService.Service.ServiceHelper.GetSheetNamesFromExcel(String sourceFile)
I used a bit of Google-fu and found that this is probably an issue with the Access Database Engine. Since all Dynamics 365 FO VMs are deployed as ready-made VMs and we are not responsible for any installation, the alternative to fixing this issue ourselves is to just redeploy the VM. In this case I tried to fix it: I downloaded this install file from Microsoft and ran a repair.
This solved the issue and re-registered the DLLs needed.
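If you want to check whether the provider is registered before (or after) running the repair, here is a quick sketch from a command prompt — the registry key is the standard COM ProgID registration for ACE, which is an assumption on my part rather than anything documented for DIXF:

```shell
rem Look up the ACE OLE DB provider's ProgID registration
rem (assumed standard COM layout under HKEY_CLASSES_ROOT).
rem If this query fails, the provider is not registered and the
rem repair/reinstall of the Access Database Engine is needed.
reg query "HKCR\Microsoft.ACE.OLEDB.12.0\CLSID"
```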
In the project I am working on right now we are maintaining two release branches: one for sprint releases and one for hotfixes. Every time we release a sprint we re-target the build pipelines to point to the new branches for the next sprint. This article is a short description of where you need to change the path (mostly for me to remember):
Log into Azure DevOps
Go to Pipelines – Builds
Select the Pipeline you want to change and click Edit
Go to Get Sources and change the two fields under Server Path
Select the workflow item called Build the solution and change to the correct path in Project
Save the pipeline.
Note that when you are looking at the pipeline, you will not see the correct branch until you have actually run the build successfully once.