What are the limitations of CMT and how can we go beyond?
D365/Power Platform projects are often driven by configuration data - data stored in Dataverse tables that defines or alters the behaviour of the system. For D365, that can be things like business units and SLAs.
This config data can be as important as any of the other customisations in the system - if it's not 100% correctly in place in a given environment, that environment will behave differently. Getting it wrong can be a massive time sink, with your team spending hours trying to understand why test environments are misbehaving, or a go-live risk, with things working perfectly in your test environment and then failing in production!
The tables where this data is stored are often not solution-aware (cannot be added to a solution) so it’s not possible to version and deploy this along with the rest of your solution.
Microsoft provide a tool called the Configuration Migration Tool which is designed to plug this gap. You use a GUI to define a schema - which tables/rows should be exported and imported - and then the tool can export the resulting data to XML files and back into other environments. You can also do this in an automated way.
Sounds perfect!?
Nearly… except the CMT has some big limitations and issues.
Here are the most important limitations I’ve hit in the past when using CMT:
All of these gaps can be plugged with either a lot of manual effort (not recommended as manual processes fall apart!) or with some clever scripting to pre/post-process the XML data going in/out of CMT and the environment. This is an option, but it’s quite complex.
My solution to the limitations of CMT is to use my PowerShell module Rnwood.Dataverse.Data.Powershell to take control and do a similar thing:
How to test the process below
You can just open PowerShell locally and paste in these commands. You don’t need to run them in AzDO Pipelines etc.
How to use the process for real
You can put these commands into a .ps1 script file and run the script.
Firstly, get a connection to the source environment. This is how to do it when you are running interactively:
Install-Module -Scope CurrentUser Rnwood.Dataverse.Data.Powershell
$connection = Get-DataverseConnection -Url https://org60220130.crm11.dynamics.com/ -Interactive
Now you can export each table you want to include like this:
Get-DataverseRecord -Connection $connection -TableName contact | Set-DataverseRecordsFolder data/contacts
In this example, I'm using the contact table because everyone will have it to try, but it's an unlikely candidate for real configuration data.
When this is executed, you will see a folder with the name data/contacts created inside the current working directory, with a JSON file for each record:
The filename is the primary key of the record.
Inside, those files look like this (I've cut it down to make it easy to understand):
{
"Id": "2c8c3463-01af-ef11-b8e8-7c1e52038f2b",
"TableName": "contact",
"contactid": "2c8c3463-01af-ef11-b8e8-7c1e52038f2b",
"emailaddress1": "some@other.com",
"firstname": "Some",
"lastname": "Other"
}
There's a property per column, holding this record's value for that column.
Columns
By default Get-DataverseRecord will include all non-system columns, but you can override which columns are included with parameters like -Columns or -ExcludeColumns. See the documentation for this command for more info.
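For example, if you only want a few specific columns in the exported files, you can pass them to -Columns (a minimal sketch, assuming -Columns takes a list of column logical names - adjust the list to suit your table):

Get-DataverseRecord -Connection $connection -TableName contact -Columns firstname, lastname, emailaddress1 | Set-DataverseRecordsFolder data/contacts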
Filters
By default all records from the table will be included. There are various options like -Filter, -Join, -Id, -Name and -FetchXml to limit this. See the docs. If you are doing a really complex query, you could use the Invoke-DataverseSql command.
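For instance, a FetchXML query can be used to restrict which rows are exported (a sketch, assuming -FetchXml accepts a standard FetchXML query string; the filter values are just examples):

$fetchXml = @"
<fetch>
  <entity name="contact">
    <attribute name="firstname" />
    <attribute name="lastname" />
    <attribute name="emailaddress1" />
    <filter>
      <condition attribute="lastname" operator="eq" value="Other" />
    </filter>
  </entity>
</fetch>
"@
Get-DataverseRecord -Connection $connection -FetchXml $fetchXml | Set-DataverseRecordsFolder data/contacts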
You can add more lines for each table you want to include. Just don’t forget to change the table and folder names.
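For example, exporting a couple of typical configuration tables might look like this (the table names are just illustrations - use whichever tables hold your config data):

Get-DataverseRecord -Connection $connection -TableName businessunit | Set-DataverseRecordsFolder data/businessunits
Get-DataverseRecord -Connection $connection -TableName queue | Set-DataverseRecordsFolder data/queues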
This step is up to you depending on your desired end-to-end process. You can take the data folder full of JSON files and store it somewhere for later use when you want to deploy to each environment.
When you are doing this in an AzDO/GitHub pipeline, I'd recommend committing it to source control and then outputting the files to your build assets.
Now you'll need to get a connection to the target environment (remember to use the target environment's URL). This is how to do it when you are running interactively:
Install-Module -Scope CurrentUser Rnwood.Dataverse.Data.Powershell
$connection = Get-DataverseConnection -Url https://org60220130.crm11.dynamics.com/ -Interactive
Now you need a line for each table to read the JSON files and create or update the records:
Get-DataverseRecordsFolder data\contacts | Set-DataverseRecord -Connection $connection -Verbose
If you have multiple tables that depend on each other, you will need to put them in dependency order.
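For example, if queues reference business units, the business units need to go in first (again, the table names are just illustrative):

Get-DataverseRecordsFolder data\businessunits | Set-DataverseRecord -Connection $connection -Verbose
Get-DataverseRecordsFolder data\queues | Set-DataverseRecord -Connection $connection -Verbose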
When you run the import, the -Verbose parameter makes it output some info about what happened. It will tell you which records got created, which records got updated, and which records did not need updating.
How existing records are matched
By default, Set-DataverseRecord will match on the primary ID of the record to find existing records to update. You can use the -MatchOn parameter if you want to match on other columns. See the docs for more info.
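For instance, if your exported records could have different IDs in each environment, you could match contacts on their email address instead (a sketch, assuming -MatchOn takes the column name(s) to match on - check the docs for the exact syntax):

Get-DataverseRecordsFolder data\contacts | Set-DataverseRecord -Connection $connection -MatchOn emailaddress1 -Verbose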
Error handling
Unlike CMT, you can stop when there’s an error. Annoyingly, it’s not the default in PowerShell though! All PS scripts should usually have this at the top:
$ErrorActionPreference = "Stop"
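Putting it together, a minimal import script reusing the commands above might look like this (just a sketch):

$ErrorActionPreference = "Stop"

Install-Module -Scope CurrentUser Rnwood.Dataverse.Data.Powershell
$connection = Get-DataverseConnection -Url https://org60220130.crm11.dynamics.com/ -Interactive

Get-DataverseRecordsFolder data\contacts | Set-DataverseRecord -Connection $connection -Verbose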
So now we’ve got something equivalent to CMT. But that doesn’t solve some of the challenges we mentioned at the start. How can we modify the records as they are being imported?
Get-DataverseRecordsFolder data\contacts | ForEach-Object {
    $_.firstname = "Changed it"
    $_
} | Set-DataverseRecord -Connection $connection -Verbose
This example does a really simple static replacement: the ForEach-Object loop receives each record as a variable $_ and we can do anything we like with it. The properties on this variable correspond with what you can see in the JSON file. We have to emit the record (by writing $_ on its own line) if we want the record to be processed, and we could do this conditionally if we only want certain records to be included. Of course, a real use case could do anything here. For instance, reading the environment-specific values from parameters/environment variables.
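Here's a sketch of what that could look like - the environment variable name and the filter condition are hypothetical, just to show the shape:

Get-DataverseRecordsFolder data\contacts | ForEach-Object {
    # Hypothetical: take an environment-specific value from an environment variable
    $_.emailaddress1 = $env:CONTACT_EMAIL_OVERRIDE
    # Only emit the records we actually want to import
    if ($_.lastname -ne "DoNotImport") {
        $_
    }
} | Set-DataverseRecord -Connection $connection -Verbose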
Using your own PowerShell script instead of CMT doesn't have to be hard. Once you've got a helper and you're over the little bits of learning, you can unlock a process that is far more complete, flexible and reliable.
Give it a go and let everyone know what you think!