Setting up connection references with Azure DevOps pipelines
Power Automate has become one of the most popular tools for data-flow automation and for building simple integrations between Dataverse-based applications and other services in the last few years. Power Automate (named “Microsoft Flow” in the past) was designed as a tool for citizen developers to create simple automations, and it soon evolved into a platform that can be used to implement much more advanced scenarios in enterprise environments. Unfortunately, because of the service’s original purpose, there are still some things we should be aware of when implementing advanced Power Automate-based functionalities and CI/CD scenarios. In this article, I’ll briefly describe how Power Automate flows should be moved between environments using Azure DevOps pipelines and Dataverse solutions.
The first thing we need to discuss is the relationship between flows and connections. It is possible to store a Power Automate flow inside a Dataverse solution. On the other hand, it is not possible to add a connection to a solution (or to any other Dataverse deployment artifact). Connection references are the solution components that link flows and connections; I like to think of them as a kind of “pointer” to a connection. We are allowed to add connection references to solutions, so we can export them from the source environment and import them into the target one.
An example of moving a solution that includes flows and related connection references is visualized in the diagram below.
We can see that two flows and references to the Outlook connection are part of the solution, and they can be moved between the source and target environments as part of the solution’s export/import process. However, because connections themselves are unique to each environment, they need to be set up manually in every environment. So, to complete the import process, we need to point each reference at a valid connection in the target environment.
The process may seem very easy in the case of simple solutions (with two or three connection references) that are moved manually between environments. In real-life scenarios, however, we may have artifacts that contain many more references that need to be set up automatically during the deployment process. Here, Azure DevOps pipelines come to the rescue!
The example below describes how to automate connection reference setup inside an Azure DevOps pipeline. Let’s imagine we have a very simple flow for sending email notifications every time a new task record is created in CRM.
We created it in a development environment, and right now we would like to release a solution containing the mentioned flow into a destination environment where different accounts are used for Dataverse and Outlook connections.
The first thing we need to do is set up an Outlook connection in the destination environment. To do this, we just need to open the Connections section of the Power Apps Maker portal and create a new connection.
After saving the connection, we need to open its details and note its ID. We will need it for the mapping file used by the Azure DevOps tasks. The connection ID can be copied from the Connection Details page URL.
https://make.powerapps.com/environments/my-environment-id/connections/shared_office365/my-connection-id/details
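If you need to grab the ID programmatically, it can be parsed out of such a URL. This is a minimal sketch assuming the URL shape shown above; the environment and connection IDs are placeholders, not real values:

```python
from urllib.parse import urlparse

# Example Connection Details URL (IDs are placeholders)
url = ("https://make.powerapps.com/environments/my-environment-id"
       "/connections/shared_office365/my-connection-id/details")

# The connection ID is the path segment two places after "connections"
# (the segment in between is the connector name, e.g. shared_office365)
segments = urlparse(url).path.strip("/").split("/")
connection_id = segments[segments.index("connections") + 2]
print(connection_id)  # → my-connection-id
```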
Another important thing is to share the connection with the Azure DevOps application account. This is the account the Azure DevOps service uses to connect to Dataverse and do all the magic related to importing new solution versions. This is necessary because, during the solution deployment process, Azure DevOps will try to point the solution’s connection reference at our connection, and by default only the connection’s creator has access to it and is able to use it.
This is a very important step because, without proper access to the connection, the Azure DevOps pipeline will fail with strange-looking errors like “An unexpected error occurred” 😊.
Similar actions should be performed for the Dataverse connection in the target environment. The only difference (and the actual best-practice approach) is that it is possible to use the application account’s credentials for the configuration of the Dataverse connection. Again, we should copy the connection ID and share the connection with the DevOps application account.
Let’s investigate our deployment pipeline now. It should take the Dataverse solution from a defined location (usually build-pipeline artifacts; for the purposes of this example, I will put the solution file directly inside the Git repository) and import and publish it in the target environment. During this operation, it should map the connection references from our solution to connections that exist in the destination environment. Information about these mappings is stored inside a so-called deployment settings file. Let’s take a quick look at what it may look like in our example.
{
  "EnvironmentVariables": [],
  "ConnectionReferences": [
    {
      "LogicalName": "odx_sharedcommondataserviceforapps_aa2bc",
      "ConnectionId": "",
      "ConnectorId": "/providers/Microsoft.PowerApps/apis/shared_commondataserviceforapps"
    },
    {
      "LogicalName": "odx_sharedoffice365_d1766",
      "ConnectionId": "",
      "ConnectorId": "/providers/Microsoft.PowerApps/apis/shared_office365"
    }
  ]
}
The deployment settings file may be generated automatically with the PAC CLI (the pac solution create-settings command). Detailed instructions on how to do this may be found on the Microsoft Learn site.
The “EnvironmentVariables” section is not relevant to our case, so let’s skip it for now. “ConnectionReferences” is the only section we should be interested in. It contains two entries representing the connection references stored inside our solution. Each connection reference is described by its name (“LogicalName”), the identifier of the connector it targets (“ConnectorId”), and “ConnectionId”, which should be filled with the ID of the matching connection in the target environment.
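The filling-in step can also be scripted, which is handy when a solution carries many references. This sketch uses Python’s standard library; the connection ID values are placeholders standing in for the IDs copied from the target environment’s Connection Details pages:

```python
import json

# Deployment settings as exported, with ConnectionId still empty
settings = {
    "EnvironmentVariables": [],
    "ConnectionReferences": [
        {"LogicalName": "odx_sharedcommondataserviceforapps_aa2bc",
         "ConnectionId": "",
         "ConnectorId": "/providers/Microsoft.PowerApps/apis/shared_commondataserviceforapps"},
        {"LogicalName": "odx_sharedoffice365_d1766",
         "ConnectionId": "",
         "ConnectorId": "/providers/Microsoft.PowerApps/apis/shared_office365"},
    ],
}

# Placeholder IDs copied from the target environment (not real values)
target_connections = {
    "odx_sharedcommondataserviceforapps_aa2bc": "dataverse-connection-id",
    "odx_sharedoffice365_d1766": "outlook-connection-id",
}

# Point every reference at the matching target-environment connection
for ref in settings["ConnectionReferences"]:
    ref["ConnectionId"] = target_connections[ref["LogicalName"]]

print(json.dumps(settings, indent=2))
```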
Let’s commit this file to our repository and try to use it inside the DevOps release pipeline.
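Before committing, a quick sanity check can catch references that were left unmapped, which would otherwise surface only as a failed import. This is an optional sketch; the helper name is mine, not part of any official tooling:

```python
import json

def missing_connection_ids(settings: dict) -> list:
    """Return logical names of connection references whose ConnectionId is empty."""
    return [ref["LogicalName"]
            for ref in settings.get("ConnectionReferences", [])
            if not ref.get("ConnectionId")]

# Example: one reference still unmapped
settings = json.loads("""
{
  "EnvironmentVariables": [],
  "ConnectionReferences": [
    {"LogicalName": "odx_sharedoffice365_d1766",
     "ConnectionId": "",
     "ConnectorId": "/providers/Microsoft.PowerApps/apis/shared_office365"}
  ]
}
""")
print(missing_connection_ids(settings))  # → ['odx_sharedoffice365_d1766']
```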
- task: PowerPlatformImportSolution@2
  inputs:
    authenticationType: 'PowerPlatformSPN'
    PowerPlatformSPN: 'OneDynamics Demo Environment'
    SolutionInputFile: 'BlogPost_1_0_0_0_managed.zip'
    UseDeploymentSettingsFile: true
    DeploymentSettingsFile: 'deployment-settings.target.json'
    AsyncOperation: true
    MaxAsyncWaitTime: '60'
    HoldingSolution: false
    PublishCustomizationChanges: true
After running the DevOps pipeline with the above task, our solution should be imported, and all the connection references inside should be successfully mapped to connections available in the target environment.
All the artifacts described in this text may be found in our GitHub repository.