Warning:[DWCE0001] Export was skipped. Max lookup count supported in Initial Sync stage is 10. Current lookup count 11

Today I was setting up the Dual-Write sync for one of my customers and I bumped into this error message:

Warning:[DWCE0001] Export was skipped. Max lookup count supported in Initial Sync stage is 10. Current lookup count 11

The issue here is that Dataverse enforces a hard limit on the number of lookup fields during the initial sync stage, for performance reasons I would guess.

In our case the default mapping for Customer V3 -> Account contained these lookup fields:

transactioncurrencyid.isocurrencycode
msdyn_customergroupid.msdyn_groupid
msdyn_billingaccount.accountnumber
msdyn_paymentday.msdyn_name
msdyn_customerpaymentmethod.msdyn_name
msdyn_paymentschedule.msdyn_name
msdyn_paymentterm.msdyn_name
msdyn_vendor.msdyn_vendoraccountnumber
primarycontactid.msdyn_contactpersonid
msdyn_salestaxgroup.msdyn_name
msdyn_company.cdm_companycode

The workaround is to remove one of the problematic mappings, do the initial sync, and then add it back. Remember to take a screenshot of the mapping you are removing so you can put it back exactly as it was.

Missing Warehouse app step instructions in D365FO

This week we have been having an issue with the warehouse at a customer. We are in the process of rolling out a new legal entity, and when moving the legal entity to a new environment using DMF we noticed that the Warehouse app step instructions stopped working and were replaced with very cryptic labels.

This issue was not present in the environment we used to set up the legal entity, but rather it occurred when we moved it.

Turns out that enabling the feature (Warehouse app step instructions) creates the default steps in all existing legal entities, but if you create a new legal entity afterwards the steps are not created.

To create these in a new legal entity we need to click “Create Default Steps” in the Mobile Device Steps module.

There is also no data entity for moving mobile device steps from one environment to another, which might be a good candidate for a future improvement.

That is all for today

Finding the PC name of a POS register

In my previous post I showed you how to get the installed version of the POS software on each register. In some cases the update has not worked correctly and you need to update a register manually, which means connecting to the actual PC (or even going on site).

Unfortunately the information in D365FO does not contain the computer name. To get the computer name we need to look in LCS telemetry. The only value I found that maps between the D365FO export and LCS is the Physical Device ID. To be able to match the two Excel spreadsheets we need to add the Physical Device ID field to the export from D365FO.

To export telemetry from Lifecycle Services, log into LCS, select the environment you are looking at and click Environment Monitoring.

The maximum number of log lines is 5000, so in order to see what we want we need to limit the results a bit. First of all, set Query Name to Retail Channel events. Select a time interval (10 days or so) and set the log source to Retail Modern POS. To further limit the results the query returns, I add 1018 to the search terms. Export the information by clicking Export Grid.

Copy the entire sheet and paste it into a second tab in the D365FO Excel. Name the two tabs D365FO and Telemetry. In the Telemetry tab go to Data and click Remove Duplicates. Click Unselect All and then choose the Role Instance column and click OK.

In the D365FO tab, choose the first cell in the first empty column and paste the following in the formula field

=XLOOKUP(LEFT([@[Physical device ID]];25);Telemetry!Q:Q;Telemetry!G:G;;-1)

Note: You might need to adjust the column name at the start of the formula to fit your export, and the argument separators (I use semicolons to match my regional settings; in a US locale you would use commas). The columns pointing to the Telemetry tab should be OK if you keep the export standard. Fill all the cells in the column with the same formula. This should add the PC name to the Excel file.
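
If you prefer scripting over Excel, here is a minimal PowerShell sketch of the same lookup, assuming both grids have been saved as CSV files and that the telemetry export has a Physical device ID column with the PC name in Role Instance. The file names and column headers are assumptions, so adjust them to match your actual exports.

# Minimal sketch: join the two exports in PowerShell instead of Excel.
# File names and column headers are assumptions -- rename them to match your exports.
$telemetry = Import-Csv '.\LCS-Telemetry.csv'
$registers = Import-Csv '.\D365FO-Registers.csv'

# Index the telemetry rows on the first 25 characters of the device id
$byDevice = @{}
foreach ($row in $telemetry) {
    $id  = $row.'Physical device ID'
    $key = $id.Substring(0, [Math]::Min(25, $id.Length))
    if (-not $byDevice.ContainsKey($key)) { $byDevice[$key] = $row.'Role Instance' }
}

# Append the PC name (Role Instance) to every register row and save the result
foreach ($reg in $registers) {
    $id  = $reg.'Physical device ID'
    $key = $id.Substring(0, [Math]::Min(25, $id.Length))
    $reg | Add-Member -NotePropertyName 'PC Name' -NotePropertyValue $byDevice[$key]
}
$registers | Export-Csv '.\Registers-with-PCName.csv' -NoTypeInformation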

That is it for today 🙂

Finding current POS package version for registers

When doing a version upgrade of the POS registers in a Retail installation it might be necessary to verify that all registers have been updated correctly. If you are running a large organization with several hundred registers this needs to be done efficiently.

Start by going to Registers by entering Registers in the search box and pressing Enter.

Right-click in the white area to the right of the Registers Grid. Click Personalize: TerminalGrid and then click Add a field.

Select Currently installed package version in the Retail Devices table and click Update.
Note: If you have multiple legal entities you might want to add that column too.

To use this information (and maybe create nice graphs), click the Office icon and choose Export to Excel – POS registers.

Printing a test page from PowerShell

Today's challenge comes from trying to troubleshoot printers in AX 2012.

Some of the printing flows are not that straightforward and might require printing from a service account. If you do not want to log in to the UI as that user, or maybe you are not able to, you can use PowerShell to do a test print from that account. This functionality is not available directly in Windows, but I found a PS function that does this, which I modified a bit (original links below).

This function uses the Win32_Printer CIM class to list all printers and then sends a test page to the selected printer. You can also use it without modifying the default $printername by passing the -printername parameter.

Function Out-TestPage
{
    Param(
        [string]$printername = "\\printserver\printername"
    )

    # List all printers and pick the one matching the requested name
    $Printers = Get-CimInstance -ClassName Win32_Printer
    $Printer  = $Printers | Where-Object Name -eq $printername

    # Send a test page to the selected printer
    Invoke-CimMethod -MethodName PrintTestPage -InputObject $Printer
    Write-Host "Printing to $($Printer.Name)"
}

To run this as the service account, simply launch PowerShell or PowerShell ISE as that account and run the function block. Then call the function using Out-TestPage -printername "[the name of your printer]".
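
For reference, one way to start PowerShell as another account is the built-in runas utility; the account and printer names below are just placeholders.

# Launch PowerShell as the service account (you will be prompted for its password)
runas /user:CONTOSO\svc-pos powershell.exe

# In the window that opens, paste the function block above and then call it
Out-TestPage -printername "\\printserver\printername"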

Links
Use PowerShell to Send Test Page to a Printer – Scripting Blog (microsoft.com)
[SOLVED] Print Test Page from PS – Printers & Scanners – Spiceworks

Article: DynamicsCon Fall 2021 Preview: Why microservices matter to the future of Dynamics 365 for Finance and Operations

Microsoft Dynamics 365 for Finance and Operations (D365FO) is Microsoft’s latest iteration of their flagship ERP system that used to be called Dynamics AX (and before that Axapta). Over all those years of software evolution, the ongoing goal has been to build a modern ERP system that is easier to support, maintain, and innovate on with new types of application development.

With D365FO, Microsoft has introduced many important changes that modernize the ERP experience and make it more attractive to customers. Key differences from the AX era include a new model for development, a One-Version policy for periodic updates, and an evolving approach to adding features to the ERP system.

For those of you who are already using D365FO, you have probably noticed feature management, where new features appear after updating to the latest version. Most new features represent new or improved capabilities that are based on what was already in the product, but sometimes a large new feature appears that provides a significant new tool or something that has been totally rebuilt from scratch.

When Microsoft create these large changes, they are probably thinking (I do not work for Microsoft so I cannot be sure) “Should this really become part of the core product?”. There are some historic examples of this type of decision making from the product team: The retail components in Dynamics AX 2012, integrations in AX moved to an external integration platform, and using SSRS instead of a built-in reporting engine.

Today, Microsoft are continuing to move traditional features out of D365FO and deploy modernized versions of them as cloud services that are accessible from the core ERP with APIs. The first one that became available was resource planning. The built-in MRP functionality has been deprecated and completely removed from the core product and it is instead available as a plugin to install from LCS. Another example is the configuration of exports to Data Lake (which will replace the Entity Store). In fact, Data Lake and Dataverse are an integral part in other microservice integrations.

A lot of these services are in preview now and will soon be available. A few other examples include Finance Insights, Tax Calculation, Electronic Invoicing, and Expense Management. I suspect we will be seeing a lot more of these in the future.

There are many reasons why this change is happening right now. When Microsoft “moved” AX to the cloud, “AX7” was mostly Dynamics AX 2012 R3 with a web interface. In the five or so years since then, we have seen a lot of change. The entire architecture has changed from a traditional on-prem software stack with a database and a Win32 frontend to a containerized solution with near zero-downtime maintenance. In order to improve the product even more, reduce complexity, increase performance, and improve resilience to customizations, Microsoft are building a more modularized solution that scales better and does not affect the speed of the core product. We see this clearly in MRP, where a resource planning job that used to take multiple hours now finishes in 15 minutes.

Another upside of this design is that these services can be updated without having to “respect” customizations made by customers and partners outside of Microsoft’s control. With just a lightweight interface for all parties to respect, Microsoft is free to iterate even faster than the monthly cadence of One-Version.

We are also seeing customers and ISV vendors using the same methodology when building code… “If it doesn’t need to be in the ERP system, we should probably move it out”. If you look in Microsoft Azure, you will also notice that there is a whole slew of services available for building these modular solutions.

If you want to learn more about these services, why they exist, how to use them and whether you can build your own; please register for DynamicsCon and see my session called “Why Micro Services in Dynamics 365 for Finance and Operations”. I will be there for the session and of course to answer all your questions.

This article was originally posted on MSDynamicsWorld on August 20th, 2021.

Build Pipelines with D365FO version 10.0.21

Hi all

Yesterday Microsoft released the PEAP version of 10.0.21 and on the new VM images they have decided to update a whole lot of stuff (Yay!!).

The new VMs are based on Windows Server 2019, SQL Server 2019 and Visual Studio 2019, which means that your existing pipelines in Azure DevOps will not work anymore unless you make some minor changes. I basically made the adjustments from this article by Joris de Gruyter and added a final step to mitigate a SQL Server issue:

  • In the Build Solution step, change MSBuild Version to MSBuild 16.0
  • In the Database Sync step, change MSBuild Version to MSBuild 16.0
  • In the Execute Tests step, change Test Platform version to Visual Studio 2019
  • In the script "C:\DynamicsSDK\DeploySSRSReports.ps1" on line 127, change
    Restart-Service -Name "ReportServer" -Force
    to
    Restart-Service -Name "SQLServerReportingServices" -Force
    (a quick way to check the service name on your VM is shown below)
That's it... the SQL issue will most certainly be fixed in the released version of the VMs.
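
If you are unsure which Reporting Services service name your build VM actually uses, a quick check like this should tell you (just a sanity check, not part of the official steps):

# List the Reporting Services-related services and their status
Get-Service | Where-Object Name -like "*Report*" | Select-Object Name, Status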

Links
Updating The Legacy Pipeline for Visual Studio 2017 (codecrib.com)

Package Management in Windows

One of my main envies of Linux over the past years has been apt-get – the solution for managing installs and updates of software packages. There have historically been a couple of different solutions available for Windows. The best known one is a third-party solution called Chocolatey that you can install and use on Windows and which has a huge repository of software available. The issue I had with Chocolatey was that it was not built into Windows… It felt a bit off having to install software to be able to install software.

A couple of years ago Microsoft included OneGet in PowerShell and I tried it a couple of times, but lazy as I am, I always felt it was a bit over-complicated. I never got the hang of having to install and trust providers and repositories. Since I mainly do this once, when I reinstall a computer, I never really found it worth the time to learn.

Fast forward to May 2020, when Microsoft introduced Winget. It was first made available as an install from the Windows Store or directly from GitHub, and since around May 2021 it has been included in Windows by default with no extra install required.

Yesterday when I reinstalled my computer I thought I would give it a shot.

To find software you use winget search. You can for instance type "winget search microsoft." (note the . at the end) to see all Microsoft software in the repository.

In the list of Microsoft packages you will see a lot of the regular downloadable packages, such as PowerShell and OneDrive. You will also find Microsoft Office, Teams and Visual Studio. All of Microsoft's redistributable packages for .NET and C++ support are also available if you need any prerequisites. A lot of the software might require a license which you will have to provide… when you for instance install Office, you log into it as usual to provide your license information.

To install software you use the command winget install. You could for instance use winget install PowerToys to install Microsoft PowerToys, or winget install "Microsoft vscode-insiders" to install the Visual Studio Code Insiders version. Note that you can use any of the information in the search results to identify which package should be installed.

Winget also handles updating packages. To upgrade a specific package you can use winget upgrade [package name], or use winget upgrade --all to upgrade all packages (winget upgrade on its own lists the packages that have updates available).
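
To summarize, the commands I ended up using look roughly like this (the package ids are examples; check winget search for the exact id in your case):

# Search the repository (note the trailing dot to list Microsoft-published packages)
winget search microsoft.

# Install packages by name or id
winget install Microsoft.PowerToys
winget install "Microsoft vscode-insiders"

# Upgrade a single package, or everything that has an update available
winget upgrade Microsoft.PowerToys
winget upgrade --all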

I really like what I see so far… 🙂

That is all for today

Fixing: The model version ‘3.6’ is not supported

When doing a refresh of one of our D365FO dev environments today I encountered an issue that puzzled me a bit. I ran the following script to export the BacPac file:

cd C:\Program Files (x86)\Microsoft SQL Server\140\DAC\bin

SqlPackage.exe /a:export /ssn:<server>.database.windows.net /sdn:MyNewCopy /tf:D:\Exportedbacpac\my.bacpac /p:CommandTimeout=1200 /p:VerifyFullTextDocumentTypesSupported=false /sp:<SQLAdmin password> /su:sqladmin

At first the folder 140\DAC\bin could not be found – there was only a folder called 130\DAC\bin.

Unfortunately this SqlPackage file was too old to handle the BacPac file.

First I tried to update Management Studio… unfortunately that did not help. After researching a bit online I found Microsoft® SQL Server® Data-Tier Application Framework (18.2) which installed a later version of SqlPackage in C:\Program Files\Microsoft SQL Server\150\DAC\bin. That did the trick.
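
A quick way to see which SqlPackage versions are installed on a machine (and which folder to point your script at) is something like this:

# List the installed SqlPackage.exe files and their product versions
Get-ChildItem "C:\Program Files*\Microsoft SQL Server\*\DAC\bin\SqlPackage.exe" |
    Select-Object FullName, @{ Name = 'Version'; Expression = { $_.VersionInfo.ProductVersion } }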

Links:
https://www.microsoft.com/en-us/download/confirmation.aspx?id=58207