How to get connected and unleash the power of scripting to your build/release/deployment pipelines
Why use PowerShell in AzDo Pipelines? I covered this in a previous post.
So now that you know PowerShell can give you superpowers in your Power Platform deployment pipelines, how do you get started?
This article assumes you have already installed Power Platform Build Tools and connected to your Power Platform environment using a Service Principal stored in an Azure DevOps Service Connection. Find out how here if that is not the case.
We are going to use YAML Pipelines. YAML Pipelines might look harder than Classic/Release Pipelines (the type you will see mentioned more often in the PP space) because you have to define them using code rather than a designer, but they come with many advantages once you get over that initial learning curve. If you’re not ready for this jump yet, most of what I’ve written here should still be applicable - you will just need to find the corresponding tasks and properties in the designer.
Microsoft say:
If you are new to Azure Pipelines, it is recommended to start with YAML pipelines.
Creating an empty YAML Pipeline is easy. Just pick the option to create a new pipeline and look for the option to create an empty one ignoring all the templates.
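If you want something to paste in to get going, a minimal starter pipeline looks like this. The trigger branch and the hosted agent image are my assumptions here - adjust them to suit your project:

```yaml
# Minimal starter pipeline (sketch) - runs on every push to main
# using a Microsoft-hosted Windows agent
trigger:
  - main

pool:
  vmImage: windows-latest

steps:
  - pwsh: write-host "Pipeline is working"
    displayName: Sanity check
```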
Including PowerShell in a pipeline is easy; you just need to use the pwsh shortcut for one of your steps:

- pwsh: write-host Hello world
  displayName: Say hello using PowerShell
If you want multiple lines of script, you can write it like this:

- pwsh: |
    write-host Hello world
    write-host Hi again
  displayName: Say hello using PowerShell
The script runs from the line after the | until the indentation goes back to the level of pwsh or less.
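For example, in this sketch the two write-host lines are part of the script because they are indented beyond pwsh, while displayName ends the block by returning to the pwsh level:

```yaml
- pwsh: |
    write-host "Inside the script"
    write-host "Still inside the script"
  displayName: Back at the pwsh level, so not part of the script
- pwsh: write-host "A separate second step"
  displayName: Second step
```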
We’ll cover how to put your script in a file instead of inline another time.
Confusingly, there are two different things called PowerShell that we can use. This is down to the history of PowerShell. More information about this is available here.
The powershell step shortcut will run Windows PowerShell (on a Windows build agent):

- powershell: write-host Hello world
  displayName: Say hello using Windows PowerShell
The pwsh step shortcut will run the newer cross-platform PowerShell:

- pwsh: write-host Hello world
  displayName: Say hello using PowerShell
So which should we use?
Whenever possible, you should use PowerShell (pwsh) and not Windows PowerShell (powershell), because pwsh is the version that is being maintained long-term. Windows PowerShell will go away eventually.
Why would we ever use Windows PowerShell at all then?
Some PowerShell modules have not yet been converted to work with the newer PowerShell. If you need one of these and can’t find an alternative, you’ll have to use the older version. An example in the Power Platform space is Microsoft.Xrm.Data.PowerShell. Luckily, for interacting with Power Platform (Dataverse) data there are other choices, including my module Rnwood.Dataverse.Data.PowerShell, which we’ll be using.
Build agents used by Azure Pipelines only have certain software pre-installed. So we need to make sure both the Power Platform Build Tools and our PowerShell module are installed.
Installing PS modules is easy:
- pwsh: |
    install-module -scope currentuser Rnwood.Dataverse.Data.PowerShell
We also need to install the Power Platform Build Tools using the special task that is provided:
- task: microsoft-IsvExpTools.PowerPlatform-BuildTools.tool-installer.PowerPlatformToolInstaller@2
  displayName: 'Power Platform Tool Installer'
This needs to go above our pwsh step, because steps run in the order listed.
Now, it’s time to dig out the name of your AzDO Service Connection. This is the secure way of storing connection details with credentials in AzDO. Jump back to the top of this article for a link if you don’t have one.
The first task we need to include is part of the PPBT. It reads the service connection and stores the bits we need into variables in the pipeline for us to use later. This should be inserted after the tool installer:
- task: microsoft-IsvExpTools.PowerPlatform-BuildTools.set-connection-variables.PowerPlatformSetConnectionVariables@2
  displayName: 'Set Connection Variables'
  name: connectionVariables
  inputs:
    authenticationType: 'PowerPlatformSPN'
    PowerPlatformSPN: 'My Service Connection Name'
Next, we need to use the variables that this has set to get connected in our PowerShell script. Let’s edit our pwsh task:
- pwsh: |
    install-module -scope currentuser Rnwood.Dataverse.Data.PowerShell
    $connection = Get-DataverseConnection -url "https://someenv.crm11.dynamics.com" -clientid "$(connectionVariables.BuildTools.ApplicationId)" -clientsecret "$(connectionVariables.BuildTools.ClientSecret)"
Finally, we’ve got a connection! Time to use it.
- pwsh: |
    install-module -scope currentuser Rnwood.Dataverse.Data.PowerShell
    $connection = Get-DataverseConnection -url "https://someenv.crm11.dynamics.com" -clientid "$(connectionVariables.BuildTools.ApplicationId)" -clientsecret "$(connectionVariables.BuildTools.ClientSecret)"
    # List all users
    Get-DataverseRecords -connection $connection -TableName systemuser
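Putting the pieces together, the steps section of the pipeline ends up looking something like this. The environment URL and service connection name are placeholders you’ll need to swap for your own values:

```yaml
steps:
  # 1. Install the Power Platform Build Tools binaries onto the agent
  - task: microsoft-IsvExpTools.PowerPlatform-BuildTools.tool-installer.PowerPlatformToolInstaller@2
    displayName: 'Power Platform Tool Installer'

  # 2. Read the service connection and expose its details as pipeline variables
  - task: microsoft-IsvExpTools.PowerPlatform-BuildTools.set-connection-variables.PowerPlatformSetConnectionVariables@2
    displayName: 'Set Connection Variables'
    name: connectionVariables
    inputs:
      authenticationType: 'PowerPlatformSPN'
      PowerPlatformSPN: 'My Service Connection Name'

  # 3. Install the module, connect, and run our script
  - pwsh: |
      install-module -scope currentuser Rnwood.Dataverse.Data.PowerShell
      $connection = Get-DataverseConnection -url "https://someenv.crm11.dynamics.com" -clientid "$(connectionVariables.BuildTools.ApplicationId)" -clientsecret "$(connectionVariables.BuildTools.ClientSecret)"
      # List all users
      Get-DataverseRecords -connection $connection -TableName systemuser
    displayName: List users using PowerShell
```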
Run your pipeline, and if all goes well, you should see the list of users in the output.
The $connection object returned is an instance of the Dataverse SDK ServiceClient class. You can do anything the SDK allows, but the PS module makes it easier, especially since PS provides many helpers.
For instance, you could insert a list of contacts from a CSV file like this:

Import-Csv contact.csv | Set-DataverseRecord -connection $connection -TableName contact
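As a sketch, a contact.csv like the one below would work. The column headers need to match the logical names of the contact columns you want to set - firstname and lastname are standard contact columns, but the exact columns in your file are up to you:

```
firstname,lastname
Jane,Smith
Sam,Jones
```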
The | is called the pipeline. It passes the output of one command into the next. So the rows that come out of the CSV file get fed into Set-DataverseRecord, which creates a new record for each one.
Find out what else you can do in the Rnwood.Dataverse.Data.PowerShell docs.