Power Automate – Update definition JSON programmatically

When importing Flow packages into different tenants, the configuration often needs updating – e.g. with a SharePoint trigger you might need to update the URL of the SharePoint site and the name of a list/library. This can be very time-consuming, for example if you are deploying the same Flow to multiple customer tenants.

Wouldn’t it be great if we could run a script, capture some information from the user and have it update the Flow config for us, making the import slicker and less effortful? Well, the good news is – you can! And I’ve provided a PowerShell script showing you how to do it.

My use case for doing this is further automating the deployment of a large Power Platform/SharePoint/Azure Logic Apps-based solution that I’ve built. I wanted to remove as many manual steps as possible.

Ultimately a Flow is just JSON, so it is relatively easy to update if you replace any hardcoded values (e.g. site URLs) with placeholders. The file that needs updating is specifically the ‘definition.json’ file, which contains the definition of the entire Flow. It should also be possible to update the connections dynamically too, but that’s an exercise for another day!


The first step is to extract the zip file for your Flow. You will find the definition file in the following location (the folder name reflects your Flow’s name, export timestamp and Flow ID) – MyFlow_20191206132543/Microsoft.Flow/flows/flowid/
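Extracting the package and locating the definition file can be done from PowerShell too. A minimal sketch, assuming an example package name (your zip file name will differ):

```powershell
# Extract the exported Flow package (file name here is an example)
Expand-Archive -Path ".\MyFlow_20191206132543.zip" -DestinationPath ".\MyFlow_20191206132543"

# Find the definition.json inside the extracted folder structure
$definitionFile = Get-ChildItem -Path ".\MyFlow_20191206132543" -Recurse -Filter "definition.json" |
    Select-Object -First 1 -ExpandProperty FullName
```

Searching recursively with Get-ChildItem saves you from typing out the Flow ID folder name by hand.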

Next, replace any hardcoded values in the exported config with placeholder values (see an example below) –

    "type": "ApiConnection",
    "inputs": {
        "host": {
            "connection": {
                "name": "@parameters('$connections')['shared_sharepointonline']['connectionId']"
            },
            "api": {
                "runtimeUrl": "https://uk-001.azure-apim.net/apim/sharepointonline"
            }
        },
        "method": "get",
        "path": "/datasets/@{encodeURIComponent(encodeURIComponent('{SiteURL}'))}/tables/@{encodeURIComponent(encodeURIComponent('{RequestsListID}'))}/onupdateditems",
        "authentication": "@parameters('$authentication')"
    },

In the above example, ‘{SiteURL}’ is the placeholder that replaces the URL of the SharePoint site, and ‘{RequestsListID}’ is the placeholder for the ID of the SharePoint list.

Now it is time to update the PowerShell script – https://gist.github.com/sharepointalex/2d3c8f4ad531788609fb39fed2406a76 – to suit your requirements. Before you do, it’s worth understanding how it works. The script performs the following actions –

  1. Gets the file path to the definition.json file and the top level folder path (parent directory of the extracted flow).
  2. Uses the Get-Content cmdlet to read the file and, with chained replace calls, swaps the placeholder text for variable values.
  3. Uses the Compress-Archive cmdlet (part of the Microsoft.PowerShell.Archive module) to re-zip the files/folders.
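The steps above can be sketched roughly as follows. This is a simplified illustration, not the gist itself – the paths, placeholder names and example values are assumptions based on the example earlier:

```powershell
# Assumed paths and values – adjust to match your own Flow package and environment
$flowFolder     = ".\MyFlow_20191206132543"
$definitionFile = "$flowFolder\Microsoft.Flow\flows\flowid\definition.json"
$siteUrl        = "https://contoso.sharepoint.com/sites/Requests"
$requestsListId = "00000000-0000-0000-0000-000000000000"

# 1 & 2. Read the definition and swap the placeholders for real values
$definition = Get-Content -Path $definitionFile -Raw
$definition = $definition.Replace("{SiteURL}", $siteUrl).Replace("{RequestsListID}", $requestsListId)
Set-Content -Path $definitionFile -Value $definition

# 3. Re-zip the files/folders into a new package ready for import
Compress-Archive -Path "$flowFolder\*" -DestinationPath ".\MyFlow_Updated.zip" -Force
```

Using Get-Content with -Raw reads the file as a single string, which keeps the chained .Replace() calls simple and preserves the original line endings.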

Once you’ve updated the script appropriately, simply run it and your new Flow package will be created, ready to go.

Now, when importing your Flow, it will be pre-configured for the environment you wish to deploy it to. This is especially useful if you are handing the package over to a customer to import into their environment. You could also automate the import itself using the Power Apps admin PowerShell module/cmdlets – this is what I plan to do in my deployment script.

Granted, a bit of upfront work is required to obtain the values for the new environment – site URLs, list IDs etc. – but it saves a lot of effort if the Flow will be imported many times 😀👏

Now to try the same process for an exported Power App – watch this space!
