One of my main envies of Linux in the past years has been apt-get – the solution for managing installs and updates of software packages. Historically there have been a couple of different solutions available for Windows. The best known one is a third-party solution called Chocolatey that you can install and use on Windows and which has a huge repository of software available. The issue I had with Chocolatey was that it was not built into Windows… It felt a bit off having to install software to be able to install software.
A couple of years ago Microsoft included OneGet in PowerShell and I tried it a couple of times, but being a bit lazy as I am I always felt it was a bit over-complicated. I never got the hang of having to install and trust providers and repositories. Since I mainly do this once, when I reinstall a computer, I never really found it worth the time to learn how to do this.
Fast forward to May 2020 when Microsoft introduced Winget. It was first made available as an install from the Microsoft Store or directly from GitHub, and since around May 2021 it is included in Windows by default with no extra install required.
Yesterday when I reinstalled my computer I thought I would give it a shot.
To find software you use winget search. You can for instance type “winget search microsoft.” (note the . at the end) to see all Microsoft software in the repository.
In the list of Microsoft packages you will see a lot of the regular downloadable packages such as PowerShell and OneDrive. You will also find Microsoft Office, Teams and Visual Studio. All of Microsoft’s redistributable packages for supporting .NET and C++ are also available in case you have any prerequisite packages. A lot of the software might require a license which you will have to provide… when you for instance install Office you log into it as usual to provide your license information.
To install software you use the command winget install. You could for instance use winget install PowerToys to install Microsoft PowerToys, or winget install “Microsoft vscode-insiders” to install the Visual Studio Code Insiders version. Note that you can use any of the information in the search results to identify which package should be installed.
Winget also handles updates of packages. To upgrade a specific package you can use winget upgrade [package name], or use winget upgrade --all to upgrade all packages (running winget upgrade without arguments lists the packages that have updates available).
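To sum the commands up, a typical session might look like the sketch below. The package names are just examples taken from the text above; check the output of winget search on your own machine for the exact IDs:

```powershell
# Search the repository (the trailing dot narrows the match to IDs starting with "microsoft.")
winget search microsoft.

# Install packages by name or ID
winget install PowerToys
winget install "Microsoft vscode-insiders"

# Upgrade a single package, or everything that has an update available
winget upgrade PowerToys
winget upgrade --all
```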
This time I have borrowed a pair of Jabra Elite 85t and tested them for a couple of weeks. To some extent this review will use the other models as a reference and I will compare against them.
The Jabra Elite 85t is the latest version of Jabra’s true wireless earbuds. They ship as standard with hardware-based ANC (unlike the 75t) and a charging case.
The case that ships with the 85t is slightly larger than the one for the 75t, but only marginally. Battery life for the earbuds is 7 hours (31 hours in total with the case). The charging case has a USB-C connector for charging. The reason it is slightly larger is that it can also be charged wirelessly. The case closes magnetically and also holds the earbuds in place with magnets, which makes them snap easily into place for charging.
The noise cancellation in the earbuds is almost uncomfortably good; if you walk outdoors near traffic they suppress the traffic noise almost completely. The only thing I have noticed they struggle a bit with is wind noise. As I wrote in the review of the ANC on the 75t, I sometimes find that good ANC can be a bit tiring in a quiet environment. It becomes like a vacuum. Jabra has also redesigned the rubber tips that sit in the ear canal so that they are now oval. This was done to avoid the suction feeling you sometimes get with in-ear tips, and to avoid amplifying the thumping sounds that occur when you walk or run.
There are a couple of downsides to the 85t, mostly related to my own preferences. The first is that they are slightly larger than the 75t, which makes them stick out a bit more from your head. The second is that, unlike the Elite Active 75t, they have a glossy surface (the 75t are coated in rubber), so they are very slippery, which makes them a bit tricky to get out of the case since they are held in place by magnets. These are very small downsides and they are by far outweighed by all the advantages.
If you want a pair of really good wireless earbuds with great ANC, really good sound and wireless charging, these are definitely an excellent choice.
We have been running an integrated pipeline for RSAT tests for a while now, but all of the testing has been focused on stabilizing the tests and minimizing noise and false positives. In doing this we have chosen to run it against our SIT (OneBox) environment.
We have now decided to start testing for real in our UAT environment, which means that we need to redirect RSAT to connect to one of our Tier 2 environments instead. We will still use the SIT environment as the RSAT client but with an alternate configuration. This is a short step-by-step guide on how to create the new configuration.
Log into the RSAT client computer and start RSAT
Change the URL and SOAP URL to the ones for your Tier 2 environment:
URL:
SOAP URL:
In LCS go to the UAT environment and export the RSAT Certificate. Remember to save the export password.
Upload the file and extract it on the RSAT client machine
Install the .cer file to Local Machine – Trusted Root Certification Authorities
Import the .pfx file (using the password above) to the local machine
More details are available here: https://aka.ms/lcsRsatReadme
In RSAT – Settings, change the certificate thumbprint to the one for your Tier 2 environment.
In order to use the new config but still be able to test against the SIT environment, click Save As at the top of the page and save the new configuration to a file. You can now manually switch between the two configurations and also use the different files in build and deploy pipelines.
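The certificate steps above can also be scripted with the built-in PKI cmdlets. This is a minimal sketch; the file names and paths are assumptions based on my export, so adjust them to wherever you extracted the files:

```powershell
# The password you chose when exporting the certificate from LCS
$exportPassword = Read-Host -AsSecureString -Prompt "Export password"

# Trust the public certificate (Local Machine - Trusted Root Certification Authorities)
Import-Certificate -FilePath C:\Temp\UAT.cer -CertStoreLocation Cert:\LocalMachine\Root

# Import the private key (.pfx) to the local machine's personal store
Import-PfxCertificate -FilePath C:\Temp\UAT.pfx -CertStoreLocation Cert:\LocalMachine\My -Password $exportPassword

# Print the thumbprint to paste into RSAT - Settings
(Get-PfxCertificate -FilePath C:\Temp\UAT.cer).Thumbprint
```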
That is all for today… Good luck and happy testing 🙂
Back in 2018 I wrote an article on how to configure a D365FO instance to enable the Warehousing App. A lot of time has passed, and tonight I will set up the all-new and improved Warehouse App, so I thought I would also take the time to update the original article.
Go to the Azure portal. In Azure Active Directory – App Registrations create a web application for the warehouse portal:
Name: WhatEverYouWant
Who can use this application or access this API: Accounts in this organizational directory only
Application Type: Web app/API
Sign-on URL: https://[theURLforyourdynamicsinstance]/oauth
Open the application to edit it
Verify the Application ID. Save this for later…
Go to API Permissions and click Add a permission
Select the API called Microsoft Dynamics ERP (Microsoft.ERP)
Choose Delegated Permissions
Under Permission to other applications click add application and add Microsoft Dynamics ERP
Add the following permissions:
– Access Dynamics AX online as organization users
– Access Dynamics AX data
– Access Dynamics AX Custom Service
Click Add Permissions
Click Grant admin consent for [Your Organization] and confirm your choice
Go to Certificates and Secrets and click New client secret. Select an expiration and give the key a description like D365FO Warehousing App
Log into Dynamics 365 for Operations, go to System Administration – Users and create a new user (in my case called WMAPP). The email address can be anything since it will never be used. The user needs this role:
– Warehouse mobile device user
Now we need to associate the user with the AD application, which is done in System Administration – Setup – Azure Active Directory applications. Here we paste the App ID/Client ID from before and select the user we created. Click Save and you are done.
Install the App from the app store and enter these settings:
1. Azure Active Directory ID: AppID/ClientID from step 3
2. Azure Active Directory Client Secret: The key from step 10
3. Azure Active Directory Resource: Your Dynamics 365 URL
4. Azure Active Directory Tenant: https://login.windows.net/yourAzureADtenant.onmicrosoft.com
5. Company: Dynamics 365 for Operations Legal Entity
To configure the mobile app you create a JSON file with the connection settings.
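The file is a plain-text JSON document carrying the same values as the settings list above. Below is a sketch of what it can look like; the exact field names vary between app versions, so treat these keys as assumptions and verify them against Microsoft's documentation for your version (note that the client secret is deliberately left out here – avoid storing it in plain text):

```json
{
    "ConnectionList": [
        {
            "ConnectionName": "UAT",
            "ActiveDirectoryClientAppId": "<AppID/ClientID from step 3>",
            "ActiveDirectoryResource": "https://yourenvironment.cloudax.dynamics.com",
            "ActiveDirectoryTenant": "https://login.windows.net/yourAzureADtenant.onmicrosoft.com",
            "Company": "USMF",
            "IsDefaultConnection": true
        }
    ]
}
```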
When Microsoft upgrades your Dynamics 365 for Operations Production Environment to a Self-Service environment there are some changes that we need to be aware of.
We no longer have a lead time of 5 hours before the deploy starts, which is great – mostly in those cases where you, due to bad planning, need to perform two deploys back-to-back (yes, I know that Microsoft requires us to deploy everything in a single package).
The maximum deploy time goes down from 5 to 3 hours, which is also awesome.
This part might throw you off a bit…
Prior to Self Service we went to the Asset Library and marked the package as a Release Candidate in order to deploy it to PROD. This has changed a bit:
a. Go to the UAT environment where you deployed the package
b. Go to History – Environment Changes
c. Select the package you want to deploy to PROD and click “Mark as release candidate”
d. Go to the PROD environment and click Maintain – Update Environment
e. Select the UAT environment where you changed the package to Release Candidate and the package will appear in the list
f. Schedule the deploy as usual (note that you can deploy immediately if you want)
One of my colleagues had a question today… His customer has split their company into two tenants, and their Dynamics 365 for Operations is still in one of the tenants (Tenant A). The users in the new tenant (Tenant B) needed access, and he wondered how they could do this.
The first step for doing this is to create an Azure AD guest account invitation in the Azure Portal (In Tenant A). Go to Azure Active Directory – Users – New Guest User.
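If you prefer scripting the invitation, it can also be created with the AzureAD PowerShell module. This is a sketch; the email address, display name and redirect URL are placeholders:

```powershell
# Requires the AzureAD module: Install-Module AzureAD
Connect-AzureAD -TenantId "TenantA.onmicrosoft.com"

# Send a B2B guest invitation to the user in Tenant B
New-AzureADMSInvitation `
    -InvitedUserEmailAddress "user@tenantb.com" `
    -InvitedUserDisplayName "User Name" `
    -InviteRedirectUrl "https://myapps.microsoft.com" `
    -SendInvitationMessage $true
```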
The user receives an email where they can approve the invite…
When the user in Tenant B accepts the invite, a “placeholder account” is created in Tenant A and this account is linked to the user’s account in Tenant B. The user can now log in using his account in Tenant B but will be treated as a user in Tenant A, and the security policies from Tenant A (MFA) will be applied. Once the user has accepted the invite we need to import the user into Dynamics. Go to System Administration – Users and select Import User. Search for the user and import it.
Tonight one of my colleagues called me with issues validating the SSRS setup for an old AX 2012 R3 environment. Unfortunately it had been many years since I had even touched a 2012 server, which meant I had to turn to my trusted advisor… Google 🙂
The error he got was this: “Make sure that SQL Server Reporting Services is configured correctly. Verify the Web Service URL and Report Manager URL configuration in the SQL Reporting Services Configuration Manager.”
The problem is caused by UAC, and there are two “solutions”: run the validation with elevated permissions (Run as administrator), or disable UAC.
OK… So this is probably old news for most of you but I thought I would document this here mostly for me.
When you start a PowerShell prompt – whether it is the old and battle-tested version 5.1, the brand new 7.x, the cool new Windows Terminal or my favorite VS Code – you will get a couple of environment variables set by default. One of these is $env:PSModulePath. The problem is that it looks a little different depending on which application you look in.
A couple of notable differences: PS7 uses a PowerShell folder in your Documents folder while PS5 uses WindowsPowerShell. If you have set up OneDrive with known folder redirection for My Documents, the first path will instead start with C:\Users\[UserName]\[OneDrive folder]\Documents. Lastly, version 7 also picks up the system-wide folders for version 5.1 and adds them to the variable, so your old modules will be available there as well. You will see four system folders and one for the user. You might have issues with the OneDrive storage if you do not set the folder to “Always keep on this device”.
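An easy way to compare the variable between hosts is to print it one path per line. Running this one-liner in each console (5.1, 7.x, Windows Terminal, VS Code) shows the differences described above:

```powershell
# Split the module path on the platform's path separator (';' on Windows)
# and print one entry per line
$env:PSModulePath -split [System.IO.Path]::PathSeparator
```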
If you are running VS Code, the PowerShell extension will add its own folder called c:\Users\[UserName]\.vscode-insiders\extensions\ms-vscode.powershell-preview-2020.9.0\modules
Another notable thing with VS Code Preview is that you have two different consoles for PowerShell: one called PowerShell and one called PowerShell Integrated Console. The first one runs version 5.1 and the second runs 7.
The all-new Windows Terminal of course also has the option to run both versions.