In Dynamics 365 for Finance and Supply Chain Management we rely heavily on Azure DevOps for managing many aspects of our projects, especially development and package deployment.
Today a customer told me that they had hit the roof on the free/included allotment for parallel builds in Azure-hosted pipelines. They wanted to understand what it had been used for. I found a very handy new preview feature called Historical Views for Pipelines.
To turn it on, click the settings icon in the top right corner and select Preview Features. The feature can be turned on for single users or entire organizations.
This feature will help you understand your usage.
The first thing you need to do in order to purchase more parallel jobs is to set up billing for your Azure DevOps organization. Go to Organization Settings – Billing. The billing options are the same ones that you use for “regular” Azure.
After you have set up billing, go to Organization Settings – Parallel Jobs and select Purchase Parallel jobs.
The pricing for additional parallel jobs is $40/month, which gives you one parallel job with unlimited minutes.
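If you want a quick sanity check before committing, the flat rate quoted above makes the math trivial. A minimal sketch (the $40 figure is the Microsoft-hosted rate from the pricing page at the time of writing):

```python
# Rough monthly cost estimate for extra Microsoft-hosted parallel jobs.
# Assumes the flat $40/month-per-job rate mentioned above.
PRICE_PER_JOB = 40  # USD per month, unlimited minutes

def monthly_cost(extra_jobs: int) -> int:
    """Return the monthly cost in USD for a number of extra parallel jobs."""
    return extra_jobs * PRICE_PER_JOB

print(monthly_cost(3))  # prints 120
```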
Yesterday Microsoft released the PEAP version of 10.0.21 and on the new VM images they have decided to update a whole lot of stuff (Yay!!).
The new VMs are based on Windows Server 2019, SQL Server 2019 and Visual Studio 2019, which means that your existing pipelines in Azure DevOps will no longer work unless you make some minor changes. I basically made some adjustments to this article by Joris de Gruyter and added a final step to mitigate a SQL Server issue.
In the Build Solution step, change MSBuild Version to MSBuild 16.0
In the Database Sync step, change MSBuild Version to MSBuild 16.0
In the Execute Tests step, change Test Platform version to Visual Studio 2019
In the script “C:\DynamicsSDK\DeploySSRSReports.ps1” on line 127, change Restart-Service -Name “ReportServer” -Force to Restart-Service -Name “SQLServerReportingServices” -Force
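If you maintain several build VMs, the one-line edit above can be scripted instead of done by hand. A minimal sketch (the path and both service names come straight from the step above; run it on the build VM):

```python
# Patch DeploySSRSReports.ps1 to restart the SQL Server 2019 reporting
# service ("SQLServerReportingServices") instead of the old "ReportServer"
# service, as described in the step above.
from pathlib import Path

def patch_ssrs_script(path: str) -> bool:
    """Replace the old service name with the new one. Returns True if changed."""
    script = Path(path)
    text = script.read_text()
    old = 'Restart-Service -Name "ReportServer" -Force'
    new = 'Restart-Service -Name "SQLServerReportingServices" -Force'
    if old not in text:
        return False  # already patched, or the line looks different
    script.write_text(text.replace(old, new))
    return True

# Usage on the build VM (path from the article):
# patch_ssrs_script(r"C:\DynamicsSDK\DeploySSRSReports.ps1")
```

Running it a second time is a no-op, so it is safe to put in a provisioning script.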
That's it… the SQL issue will most certainly be fixed in the released version of the VMs.
We have been running an integrated pipeline for RSAT tests for a while now, but all of the testing has been focused on stabilizing the tests and minimizing noise and false positives. In doing this we have chosen to run it against our SIT (OneBox) environment.
We have now decided to start testing for real in our UAT environment, which means that we need to redirect RSAT to connect to one of our Tier 2 environments instead. We will still use the SIT environment as the RSAT client, but with an alternate configuration. This is a short step-by-step guide on how to create the new configuration.
Log into the RSAT client computer and start RSAT
Change the URL and SOAP URL to those of the Tier 2 environment:
URL:
SOAP URL:
In LCS go to the UAT environment and export the RSAT Certificate. Remember to save the export password.
Upload the file and extract it on the RSAT client machine
Install the .cer file to Local Machine – Trusted Root Certification Authorities
Import the .pfx file (using the password above) to the local machine
More details are available here: https://aka.ms/lcsRsatReadme
In RSAT – Settings, change the certificate thumbprint to the one for your Tier 2 environment.
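One thing worth checking when you paste the thumbprint: a thumbprint copied from the Windows certificate dialog sometimes carries spaces and invisible Unicode characters that make the certificate lookup fail silently. A small sketch that cleans one up before you paste it into RSAT:

```python
# Normalize a certificate thumbprint pasted from the Windows certificate
# dialog: drop whitespace and any non-hex characters, and upper-case it.
import re

def clean_thumbprint(raw: str) -> str:
    return re.sub(r"[^0-9A-Fa-f]", "", raw).upper()

# A left-to-right mark (\u200e) often sneaks in at the start when copying:
print(clean_thumbprint("\u200eaa bb cc dd"))  # prints AABBCCDD
```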
In order to use the new config but still be able to test against the SIT environment, click Save As at the top of the page and save the new configuration to a file. You can now manually switch between the two configurations and also use the different files in build and deploy pipelines.
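In a pipeline, switching between the two saved configurations boils down to picking the right settings file before the RSAT run starts. A sketch of that selection step, where the folder and file names (SIT.settings / UAT.settings) are hypothetical; use whatever names you chose when you clicked Save As:

```python
# Pick the saved RSAT configuration file for a target environment.
# The folder and file names below are made-up examples -- substitute
# the names you used when saving the configurations from RSAT.
from pathlib import Path

CONFIG_DIR = Path(r"C:\RSAT\Configs")  # hypothetical folder
CONFIGS = {
    "SIT": CONFIG_DIR / "SIT.settings",
    "UAT": CONFIG_DIR / "UAT.settings",
}

def config_for(environment: str) -> Path:
    """Return the settings file for an environment; raises KeyError if unknown."""
    return CONFIGS[environment.upper()]

print(config_for("uat").name)  # prints UAT.settings
```

The pipeline step can then copy the selected file over the active RSAT settings before invoking the test run.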
That is all for today… Good Luck and happy testing 🙂
I am making heavy use of DevOps queries in my daily tasks as release manager/scrum master at my customer. Especially when it comes to generating release notes.
Currently I have a query returning all DevOps items that are Ready for Release and tagged with Hotfix. I open them in Excel, format them a bit and paste them into our Microsoft Teams wiki. Excel really makes this table-tweaking much easier.
Today Excel kept crashing when I used Open in Excel and I could not figure out why, even though it had worked fine last week. After some googling I bumped into a forum post that told me to clear the cache folders for Team Foundation. It helped!! Yay!!!
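For reference, the cache folders in question live under the local AppData profile, one per client version. A sketch that locates and clears them; the folder layout is my assumption based on the forum post, so check your own %LOCALAPPDATA% before deleting anything, and close Excel and Visual Studio first:

```python
# Clear the Team Foundation client cache folders that made
# "Open in Excel" crash. The %LOCALAPPDATA%\Microsoft\Team Foundation
# layout is an assumption -- verify it on your own machine first.
import os
import shutil
from pathlib import Path

def tf_cache_dirs(local_appdata: str) -> list:
    """Return the per-version Team Foundation cache folders, if any exist."""
    root = Path(local_appdata) / "Microsoft" / "Team Foundation"
    return sorted(root.glob("*/Cache"))

def clear_tf_caches(local_appdata: str = os.environ.get("LOCALAPPDATA", "")) -> None:
    for cache in tf_cache_dirs(local_appdata):
        shutil.rmtree(cache, ignore_errors=True)
```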
A couple of days ago I reinstalled my computer, and since I usually go by “newer is always better” I installed Visual Studio 2019. When I was going to generate release notes for our latest release, the Excel add-in for DevOps did not work. I looked through the prerequisites and found nothing I had missed… It should work… WTF
When trying to set up hosted builds for D365FO in Azure DevOps according to this guide posted by Paul Heisterkamp, I bumped into an issue… in one of the last steps you push the NuGet packages, downloaded from LCS, into the artifact feed that you have created.
The issue came when uploading the largest package (microsoft.dynamics.ax.application.devalm.buildxpp.10.0.464.13.nupkg): first I got a timeout and had to set the timeout parameter in nuget.exe.
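The timeout is passed straight on the nuget.exe command line via -Timeout (in seconds). A sketch of the push command as I built it; the feed URL and timeout value are examples, only the package name is from the article:

```python
# Build the nuget.exe push command with an extended timeout.
# The feed URL and timeout below are example values.
def nuget_push_cmd(package: str, source: str, timeout_seconds: int = 3600) -> list:
    return [
        "nuget.exe", "push", package,
        "-Source", source,
        "-ApiKey", "az",  # Azure Artifacts ignores the key; any non-empty value works
        "-Timeout", str(timeout_seconds),
    ]

cmd = nuget_push_cmd(
    "microsoft.dynamics.ax.application.devalm.buildxpp.10.0.464.13.nupkg",
    "https://pkgs.dev.azure.com/yourorg/_packaging/yourfeed/nuget/v3/index.json",
)
print(" ".join(cmd))
```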
When I did that I got another error:
Error: Response status 503 – Service Unavailable
This happened after the upload had been running for some time. Looking online I noticed that others with the same issue had found it to be a network problem (in some cases a proxy issue).
To solve the issue I uploaded the file to my D365FO dev machine and tried the push from there… it worked, and it took only a couple of seconds… Woohoo…
The learning of the day… Azure VMs have more bandwidth than I do… who knew #abitenvious
Today, when writing release notes for my customer's latest release, I bumped into an issue. I have a query that returns all items in the current Iteration Path with their release notes. The release notes are written in Microsoft Word, so I thought that if I could just get the query into Excel it would be an easy copy-and-paste operation into Word.
Fortunately Azure DevOps has just this feature. You simply go to Queries, click the three dots next to the query and select Open in Excel.
I had an issue today with one of my test cases in RSAT. When running it manually from the RSAT UI it worked, but when trying to run it from a PowerShell script I got this error:
Failed to download or overwrite attachment files – Some files maybe in use.
For some reason the test case had uploaded duplicate files to Azure DevOps. When I ran the test from the RSAT UI it used the local files generated by RSAT and everything worked, but when I ran it from the console it downloaded the files from DevOps, got the duplicates and for some reason failed.
The solution was to permanently delete the test case from DevOps and resync it from BPM.
Beware that this will give the test case a new ID, which will change the order of the tests.
In the project I am working on right now we are maintaining two release branches: one for sprint releases and one for hotfixes. Every time we release a sprint we re-target the build pipelines to point to the new branches for the next sprint. This article is a short description of where you need to change the paths (mostly for me to remember):
Log into Azure DevOps
Go to Pipelines – Builds
Select the Pipeline you want to change and click Edit
Go to Get Sources and change the two fields under Server Path
Select the workflow item called Build the solution and change to the correct path in Project
Save the pipeline.
Note that when you are looking at the pipeline, you will not see the correct branch until you have actually run the build successfully once.
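The two path edits follow the same pattern every sprint, so a tiny helper for computing the new paths doesn't hurt. The $/Project/Release/SprintNN layout below is a made-up example; adjust the pattern to your own branch naming:

```python
# Compute the re-targeted server path for the next sprint.
# The $/Project/Release/SprintNN layout is a hypothetical example --
# adapt the regex to your own branch naming convention.
import re

def retarget(server_path: str, new_sprint: int) -> str:
    """Replace the SprintNN segment in a TFVC server path."""
    return re.sub(r"Sprint\d+", f"Sprint{new_sprint}", server_path)

print(retarget("$/Project/Release/Sprint12/Metadata", 13))
# prints $/Project/Release/Sprint13/Metadata
```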