Unable to find my ER Data Package

A week or so ago, a colleague working with one of our customers contacted me. She was setting up VAT reporting for Germany, and the instructions say that in order to set it up you need to import a data package from the LCS Shared Asset Library. When she went to the Shared Asset Library in LCS, it was completely empty.


It turns out that this customer is in the EU LCS tenant, and Microsoft has not populated the Shared Asset Library in EU. To get to the files there is a fairly simple solution:

  1. Before you log in to LCS, make sure you are in the United States region.
  2. Log in to LCS. Note: you will not see your project in the list of projects; it will be empty. Don’t worry, that is by design.
  3. Go to the Shared Asset Library – Data Packages. Find the package you are looking for and click it to download it.
  4. Log out of LCS, switch region and log in again.
  5. If you want it in your project's Asset Library, go to the Asset Library and upload the package.

    That is all for today, good luck 🙂

Issues with DBsync step during deploy


Today, when I was deploying a customization package to a newly deployed config environment, one of the steps did not work correctly. The environment had not yet been used for anything, so I had not even copied a database to it. When I deployed the customization package, I got the following error in the runbook log and the deploy failed:

Table Sync Failed for Table: SQLDICTIONARY. Exception: System.NotSupportedException: TableID not yet generated for table: AmcBankReconciliations

The sync step in the runbook is failing because there is no TableID for the table AmcBankReconciliations. And I thought generating those IDs was exactly what the sync process was supposed to do (??).

Having no clue why this happened, I first turned to Google (as one does), and when I could not find anything there I asked my awesome colleagues. One of them said:

“I have seen newly deployed environments behaving strangely and my solution usually is to start Visual Studio and perform a DB Sync”

This was a bit strange since it was the sync step that failed, but I thought I would give it a try. Since this is a config environment that is not going to be used with Visual Studio, I instead opted for the amazing [d365fo.tools](https://github.com/d365collaborative/d365fo.tools) to do the sync:

Invoke-D365DBSync -Verbose

When the sync had finished I tried resuming the deploy and to my surprise it finished perfectly… Nice 🙂
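
For reference, the whole sequence on the config environment was roughly the one below. The module install line assumes d365fo.tools is not already on the box, and the runbook ID and step number of course come from your own failed deploy:

# Install d365fo.tools from the PowerShell Gallery (elevated PowerShell prompt)
Install-Module -Name d365fo.tools -Scope AllUsers

# Run a full database synchronization against the local AOS
Invoke-D365DBSync -Verbose

# When the sync has finished, resume the failed step from the folder where the deployable package was extracted
AXUpdateInstaller.exe execute -runbookid=<YourRunbookId> -rerunstep=<FailedStepNumber>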

Not able to activate Data Events for entities

Yesterday I looked into an issue for one of my customers. In one of their environments they were not able to activate Data Events for any of their entities. The Activate button was completely grayed out, and the “Active Data Events” and “Inactive Data Events” tabs did not exist.

This is an environment where we are currently running a PoC for dual-write, so I just “assumed” that I had configured Power Platform correctly (you know what we say about assumptions).

It turns out I had not followed my own advice and done the Power Platform configuration from LCS… Instead I did the linking from within FnO, in the Data Management workspace.

The solution was to go through the linking wizard in LCS, and then the tabs showed up.



Note that the change to the Business Events screen is not instant… I had to let it sit overnight before the tabs showed up.

Links:
Figuring out DataVerse and DualWrite in Dynamics 365 FnO – JohanPersson.nu

Azure AD and Elevated Access

Today one of my colleagues contacted me for help with authenticating his LCS project against our Azure AD. He had created his own subscription since he had no access to our top-level tenant. When I went into the Azure Portal to look for the subscription I was not able to find it, which was a bit strange since I have the Global Admin role.

Doing some research, I found that there is something called Azure elevated access, which is sort of a UAC for Azure. It means that even if you have the Global Admin role, you will not see subscriptions you have not been given specific access to. You need to elevate your permissions in order to see everything.

This can be very useful for getting access to subscriptions in your Azure AD tenant that were created by someone else, maybe even someone who has left your organization.

In the Azure Portal, go to Azure AD and select Properties in the left-hand menu. At the bottom of the page there is a toggle called “Access management for Azure resources”. While the toggle is set to Yes, you have the User Access Administrator role at root scope and can see and assign permissions on all subscriptions and management groups in the tenant.

Remember to set it back when you are done…
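
If you prefer to script it, the toggle is essentially the elevateAccess action in Azure Resource Manager, and switching it back off corresponds to removing the User Access Administrator assignment at root scope. A minimal sketch using the Az PowerShell modules (the api-version is the one I have seen in the Docs article linked below, so double-check it there):

# Sign in with the Global Administrator account
Connect-AzAccount

# Elevate access: grants the signed-in account User Access Administrator at root scope ("/")
Invoke-AzRestMethod -Method POST -Path "/providers/Microsoft.Authorization/elevateAccess?api-version=2016-07-01"

# When you are done, remove the elevated access again
Remove-AzRoleAssignment -SignInName "you@yourtenant.com" -RoleDefinitionName "User Access Administrator" -Scope "/"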

Links
Elevate access to manage all Azure subscriptions and management groups | Microsoft Docs

Changes to the release process for D365FO

When Microsoft upgrades your Dynamics 365 for Operations production environment to a self-service environment, there are some changes to be aware of.

  1. We no longer have a lead time of 5 hours before the deploy starts, which is great, mostly in those cases where you, due to bad planning, need to perform two deploys back-to-back (yes, I know Microsoft requires us to deploy everything in a single package).
  2. The maximum deploy time goes down from 5 to 3 hours, which is also awesome.
  3. This part might throw you off a bit…

    Prior to Self Service we went to the Asset Library and marked the package as a Release Candidate in order to deploy it to PROD. This has changed a bit:

a. Go to the UAT environment where you deployed the package
b. Go to History – Environment Changes
c. Select the package you want to deploy to PROD and click “Mark as release candidate”
d. Go to the PROD environment and click Maintain – Update Environment
e. Select the UAT environment where you changed the package to Release Candidate and the package will appear in the list.
f. Schedule the deploy as usual (note that you can deploy immediately if you want)

That's all for today.

Troubleshooting Reporting Services in Dynamics 365 for Finance and Operations

Today one of my colleagues contacted me with a problem with the “Report sales tax for settlement period” report in one of our environments. It worked last Friday but not today.

When I looked up the session ID from the error message in the Environment monitoring in LCS, one of the errors I found was this one:

Microsoft.Reporting.WebForms.Internal.Soap.ReportingServices2005.Execution.RSExecutionConnection+MissingEndpointException: The attempt to connect to the report server failed. Check your connection information and that the report server is a compatible version

and another one was:

TmpTaxReportById_TaxReportField on table TaxReportTmp_SE.

I already knew this was related to SSRS (Reporting Services), so obviously I began by restarting the Report Server; it did not help.

Since the environment is a Microsoft-hosted Tier-2 environment, the first error was really strange… why would Microsoft supply an incompatible SSRS server… Not likely…

Another error I got was an authentication error… which made me think there was something wrong with the AOS. I tried restarting the IIS server… It helped!
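
For the record, the restarts themselves are nothing fancy. From an elevated PowerShell prompt on the box it is roughly this (the SSRS service name differs between versions, so treat the name below as an assumption and check with Get-Service first):

# Find the exact name of the Reporting Services service on this box
Get-Service -Name "*Report*"

# Restart SSRS (adjust the name to whatever Get-Service returned)
Restart-Service -Name "ReportServer" -Force

# Restart IIS, and with it the AOS web application
iisreset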

Configuring an LCS Repository for Electronic Reporting in Dynamics 365 for Finance and Operations

In Dynamics 365 for Finance and Operations, Microsoft provides a large repository of Electronic Reporting configurations. These cover a lot of scenarios, but for those scenarios that require custom reports you can use LCS to store them and share them between environments.

This requires some configuration in D365FO.

First you need to add a configuration provider:

  1. In D365FO, search for “Electronic reporting”
  2. On the right-hand side, click Configuration providers
  3. Click New, give it a Name and a URL, and click Save
  4. Back in the GER workspace, select the new provider and click Repositories
  5. On the Configuration repositories screen, click Add – LCS and Create repository. Select the correct LCS project and click OK.
    Note: Sometimes you need to make a connection to LCS first. To do that, click “Click here to connect to Lifecycle Services”. This is done in a separate tab. When it is done, switch back to the previous tab and click OK. If you get an error, try again.

    EDIT 2019-06-03: If you are not seeing the option to select an LCS project at all, you need to make a connection between Dynamics 365 FO and LCS. This is done under System administration – Setup – System parameters – Help tab.

    When you click the Help tab you will see the option to authenticate to LCS. Click “Click here to connect to Lifecycle Services”. Authorization is done in another tab.

    Close the new tab and click OK on the previous tab.

    Select your LCS project

    You will need to have a user in LCS that uses the same Azure AD credentials as the one you are logged into D365FO with.
    END EDIT

  6. To look at the reports in the repository, select it and click Open. You will see the reports stored in the LCS repository.
  7. You are also able to see the configurations directly in LCS, in the Asset Library. If you are working as a consultant, the configuration can then be saved to your personal Asset Library and shared with your colleagues.

  8. To give your organization access to your custom report, go to the Shared Asset Library, select it, and click Publish.

Detailed Version Information in LCS

When contacting Microsoft Support about Dynamics 365 for Operations, they often ask you for the version information of an environment. We have been having an issue with seeing it for our cloud-hosted environments… it simply was not there.


I finally found a way to fix this, and it is very simple (once you know it).

Go to the address bar in your browser… At the end of the URL you will find:

&IsDiagnosticsEnabledEnvironment=false

change it to:

&IsDiagnosticsEnabledEnvironment=true

and presto!!!
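
As a side note, if you have RDP access to the VM you can also pull the version information locally with d365fo.tools. The cmdlet name below is from memory, so treat it as an assumption and check Get-Command -Module d365fo.tools if it is not there:

# Lists the application and platform versions of the local D365FO installation
Get-D365ProductInformation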


/Johan

Error in Environment Reprovisioning tool

I have been working on an upgrade of Dynamics 365 for Operations for a while and found an issue with the Environment Reprovisioning tool, also known as the Retail Retargeting tool. This is a tool used to fix the retail components in an environment after you have copied a database to it from another environment. We did this as part of the upgrade to Platform Update 10.

When we ran the tool from the command prompt in the environment we got the following error:

The step completed
Executing step: 3
GlobalUpdate script for service model: RetailServer on machine: localhost
Run RetargetRetailServer.ps1
RetargetRetailServer.ps1 failed.
The step failed
The step: 3 is in failed state, you can use rerunstep command to debug the step explicitly
at Microsoft.Dynamics.AX.AXUpdateInstallerBase.RunbookExecutor.executeRunbookStepList(RunbookData runbookData, List`1 runbookStepList, String updatePackageFilePath, Boolean silent, String stepID, ExecuteStepMode executeStepMode, Boolean versionCheck, Boolean restore, Parameters parameters)
at Microsoft.Dynamics.AX.AXUpdateInstallerBase.RunbookExecutor.executeRunbook(RunbookData runbookData, String updatePackageFilePath, Boolean silent, String stepID, ExecuteStepMode executeStepMode, Boolean versionCheck, Boolean restore, Parameters parameters)
at Microsoft.Dynamics.AX.AXUpdateInstallerBase.AXUpdateInstallerBase.execute(String runbookID, Boolean silent, String updatePackageFilePath, IRunbookExecutor runbookExecutor, Boolean versionCheck, Boolean restore)
at Microsoft.Dynamics.AX.AXUpdateInstaller.Program.InstallUpdate(String[] args)
at Microsoft.Dynamics.AX.AXUpdateInstaller.Program.Main(String[] args)

This error was not that obvious to me, so I took a look at the logs for the deployable package, which are located in the RunbookWorkingFolder (inside the folder where you extracted the package). There I found this:

Exception : System.Management.Automation.RuntimeException: The servicing data of this box has not been migrated yet, please rerun this tool with axlocaladmin
TargetObject : The servicing data of this box has not been migrated yet, please rerun this tool with axlocaladmin
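
As a side note, a quick way to dig the real exception out of those logs is something like the lines below, run from the folder where the package was extracted. The file types and search patterns are just assumptions about what the runbook logs typically contain:

# Search everything under the runbook working folder for exceptions and failed steps
Get-ChildItem -Path .\RunbookWorkingFolder -Recurse -Include *.log, *.txt |
    Select-String -Pattern "Exception", "failed" |
    Select-Object -Property Path, LineNumber, Line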

The message about axlocaladmin made me think… The older environments always had an account with that name, but the newer ones do not. After some digging around in the script RetargetRetailServer.ps1 I found this (on line 302):

if($env:UserName -ne 'axlocaladmin')
{
    $errorMessage = "The servicing data of this box has not been migrated yet, please rerun this tool with axlocaladmin"
    Log-TimedMessage $errorMessage
    throw $errorMessage
}

I simply changed it to:

# Changed the expected account name so the check passes for my admin user instead of axlocaladmin
if($env:UserName -ne 'adminXXXXXXXXXX')
{
    $errorMessage = "The servicing data of this box has not been migrated yet, please rerun this tool with axlocaladmin"
    Log-TimedMessage $errorMessage
    throw $errorMessage
}

where adminXXXXXXXXXX is the account name of my user, and then I re-ran:

AXUpdateInstaller.exe execute -runbookid=env-reprovision -rerunstep=3

And presto… it worked 🙂

Note that this is the reprovisioning tool from 8/30/2017; I have not looked at any other versions.

/Johan

Activating Dynamics AX 2012 VMs deployed to Azure from LCS

Our company has retired all of the lab hosts we used to have internally, which means that we need to run our lab servers on Microsoft Azure. To install the servers we use LCS (Lifecycle Services), which deploys a VM in our Azure subscription. The problem is that the VM is not activated, which on an internal server would require a product key and a Windows Server license. When running VMs internally we used to handle this by rearming the VMs 3 times, which gave us a total of 180 days, and then setting up a new one. That was a bit of a hassle, but it worked. Since the VMs we deploy on Azure actually include a license, there is no need to run a server that is not activated. Here is a short description of how to activate the AX VM…

  1. Install the new VM on Microsoft Azure using LCS
  2. Log in to the server using RDP (the logon info is in LCS)
  3. Start an elevated command prompt
  4. Find the edition of the VM by running this command:

    DISM /online /Get-CurrentEdition

    We get: Current Edition : ServerDatacenterEval

  5. Find out which edition you can upgrade to by running:

    DISM /online /Get-TargetEditions

    We get: Target Edition : ServerDatacenter

    This means that you can upgrade from the evaluation edition of Datacenter to the full Datacenter edition. Now we need a license key… Microsoft uses Automatic Virtual Machine Activation (AVMA) to license the VMs in Azure. This means that if the host is activated (which it hopefully is 🙂) the guest gets activated, but to use this feature the guest VM still needs a product key. The keys are available on TechNet.

  6. Use the correct key to activate the VM by running this:

    DISM /online /Set-Edition:ServerDatacenter /ProductKey:XXXXX-XXXXX-XXXXX-XXXXX-XXXXX /AcceptEULA

  7. Restart the VM and verify that it is no longer using the evaluation edition

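    To double-check, you can run the same DISM command as in step 4 again, and also look at the license status with slmgr (assuming it behaves here as on any other Windows Server):

    DISM /online /Get-CurrentEdition
    slmgr /dli

    The first one should now report ServerDatacenter instead of ServerDatacenterEval, and the second should show the license status as Licensed.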

That’s all for today

Links:

https://technet.microsoft.com/en-us/library/dn303421.aspx