D365 F&O: Schedule backups using DevOps pipelines and database movement API

December 2, 2022 at 12:50
filed under Dynamics AX

Hi all,
We had a requirement to automatically create a backup of the database of a Tier 2 environment. Luckily there is the Database Movement API that allows you to do just that.

In his excellent blog post Automated backups of D365FO databases, Dick explains how to set this up using Release pipelines. We have done the same, but using (build) pipelines, so that’s what is documented below.

There are only minor differences. For example, we use an output variable to pass the token between tasks, but for the rest everything is pretty much the same. So thank you, Dick.

Creating the pipeline

A new pipeline is created:

Configure variables

Before you start, you need to create an application registration and have a service account available. See Step 1 to Step 3 on this page: Database movement API – Authentication.

After this is done, on your pipeline, click on the Variables tab. Add the following variables:
CLIENTID, CLIENTSECRET, GOLDEN (or ACCEPT, MIGRATION, etc., depending on your environment), LCSPROJID, PASSWORD, USERNAME.

CLIENTID: the client ID from your application registration.
CLIENTSECRET: the client secret from your application registration.
GOLDEN: the Environment Id of the environment you want to create a backup for. You can find this on the Environment details page in LCS.
LCSPROJID: the LCS project id. You can find it in the URL when you are on the project in LCS; it's an integer like 1234567.
USERNAME: the email address of the service account you will use for this operation.
PASSWORD: the password of the service account.

It should look something like this:

Don’t worry, all data is fake in the screenshot :).

Adding task Get token

  1. Next, go back to the Tasks tab, and click on the + button next to Agent job 1 to add a new task.
  2. Search for PowerShell and click on Add to add a PowerShell task.
  3. Click on your new task and name it Get token.
  4. Change the type to Inline and paste the following script:
$tokenUrl = "https://login.microsoftonline.com/common/oauth2/token"
# Request a token for the LCS API using the service account credentials
$tokenBody = @{
    grant_type    = "password"
    client_id     = "$(CLIENTID)"
    client_secret = "$(CLIENTSECRET)"
    resource      = "https://lcsapi.lcs.dynamics.com"
    username      = "$(USERNAME)"
    password      = "$(PASSWORD)"
}
$tokenResponse = Invoke-RestMethod -Method 'POST' -Uri $tokenUrl -Body $tokenBody
$token = $tokenResponse.access_token
Write-Host $token
# Expose the token as an output variable so the next task can read it
Write-Host "##vso[task.setvariable variable=TOKENOUT;isOutput=true]$token"

Next, in the Output Variables section, set the reference name to task1.
It should look like this:
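
By the way, since this is plain PowerShell (see note 2 at the bottom), you can test the request in the PowerShell ISE before wiring it into the pipeline by replacing the $(...) macros with literal values. A minimal sketch; every value below is a placeholder:

$tokenUrl = "https://login.microsoftonline.com/common/oauth2/token"
$tokenBody = @{
    grant_type    = "password"
    client_id     = "00000000-0000-0000-0000-000000000000"  # placeholder client ID
    client_secret = "<client secret>"                       # placeholder secret
    resource      = "https://lcsapi.lcs.dynamics.com"
    username      = "serviceaccount@contoso.com"            # placeholder service account
    password      = "<password>"                            # placeholder password
}
# Print the access token from the response
(Invoke-RestMethod -Method 'POST' -Uri $tokenUrl -Body $tokenBody).access_token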

Add task Backup database Golden

Next we'll add the task that will perform a backup of our environment. Repeat the same steps as before to add a PowerShell task, but name it Backup database Golden.
Set the inline script to:

# Build a backup name based on today's date in the W. Europe time zone
$cstzone = [System.TimeZoneInfo]::ConvertTimeBySystemTimeZoneId((Get-Date), 'W. Europe Standard Time')
$filedate = Get-Date $cstzone -f "yyyy-MM-dd"
$BackupName = "Goldenbackup-$filedate"
Write-Output $BackupName
# Call the export action of the Database Movement API, using the token from the Get token task
$refreshUrl = "https://lcsapi.lcs.dynamics.com/databasemovement/v1/export/project/$(LCSPROJID)/environment/$(GOLDEN)/backupName/$BackupName"
$refreshHeader = @{
    Authorization  = "Bearer $(task1.TOKENOUT)"
    "x-ms-version" = "2017-09-15"
    "Content-Type" = "application/json"
}
$refreshResponse = Invoke-RestMethod $refreshUrl -Method 'POST' -Headers $refreshHeader
Write-Output $refreshResponse
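
The export call returns immediately; the backup itself runs asynchronously in LCS. If you want the task to wait for the result, the Database movement API docs also describe a fetchstatus action you can poll with the OperationActivityId from the export response. A hedged sketch (the endpoint and the OperationActivityId/OperationStatus field names are taken from the API docs, so verify them against the current reference) that could be appended to the same inline script:

# Optional: poll the fetchstatus action until the export is no longer in progress.
# Endpoint and field names per the Database movement API docs; verify before use.
$activityId = $refreshResponse.OperationActivityId
$statusUrl = "https://lcsapi.lcs.dynamics.com/databasemovement/v1/fetchstatus/project/$(LCSPROJID)/environment/$(GOLDEN)/operationactivity/$activityId"
do {
    Start-Sleep -Seconds 60
    $statusResponse = Invoke-RestMethod $statusUrl -Method 'GET' -Headers $refreshHeader
    Write-Output $statusResponse.OperationStatus
} while ($statusResponse.OperationStatus -eq 'InProgress')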

Our pipeline will look like this:

Setting up the schedule

On the Triggers tab, click on Add in the Scheduled section. Choose your schedule and clear the checkbox Only schedule builds if the source or pipeline has changed.
Click on Save & queue, then click on Save.

All done

That's it: your pipeline will create a backup of this environment based on your schedule. You should see the backup in the Asset library in LCS once the pipeline has run.

Notes

  1. You can only call the API 3 times in a 24-hour period, so keep that in mind. This is documented on the page named Throttling.
  2. This is just a PowerShell script, so you can use the PowerShell ISE to test your scripts. Or if you have a different method of scheduling PowerShell scripts than DevOps pipelines, that will work too.
  3. The DevOps pipeline runs in who-knows-what country, so the Azure AD administrator might have to configure some access policies to allow the account to sign in from there.
  4. You can also use the API to create a database refresh (see the sketch below).
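
For the database refresh in note 4, the Database movement API exposes a refresh action with a source and a target environment. A minimal sketch of what that task's inline script could look like, assuming a hypothetical extra pipeline variable TARGET that holds the Environment Id of the environment to overwrite:

# Hedged sketch of the refresh action; TARGET is a hypothetical pipeline variable
# with the Environment Id of the environment that will be overwritten.
$refreshUrl = "https://lcsapi.lcs.dynamics.com/databasemovement/v1/refresh/project/$(LCSPROJID)/source/$(GOLDEN)/target/$(TARGET)"
$refreshHeader = @{
    Authorization  = "Bearer $(task1.TOKENOUT)"
    "x-ms-version" = "2017-09-15"
    "Content-Type" = "application/json"
}
Invoke-RestMethod $refreshUrl -Method 'POST' -Headers $refreshHeader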
