Leveraging the Power Automate API to speed up PowerApps deployments

Earlier this year, I had the opportunity to work on a large-scale PowerApps deployment at one of our clients, as part of a project involving highly classified data transactions. Due to the sensitive nature of the data, the project’s architecture had to satisfy a number of security criteria, some of which dramatically increased the time and effort needed to release our app and solution between environments on a regular basis.

Azure Active Directory SQL authentication vs traditional SQL auth

Our solution consisted of a PowerApp communicating with a SQL server via Power Automate, which acted as a REST API between the app “front end” and the SQL “back end”, performing CRUD operations (Create, Read, Update, Delete) there instead of handling them natively in the app.

One of the requirements we had to follow was that any connection to the SQL database had to authenticate using Azure Active Directory (AAD). In PowerApps and Power Automate, AAD authentication was introduced in 2019, alongside SQL authentication and gateway authentication for on-premises SQL servers.

SQL Authentication Modes in PowerApps / Power Automate

The main difference between SQL auth and AD auth is that while SQL auth requires the user to enter the server endpoint, database and access credentials, AD auth does not. Instead, for each CRUD transaction, the user is authenticated on the fly using their email address, and the server endpoint and database name are explicitly provided as hardcoded values when creating a flow using the AD connection, as sketched below. (For SQL auth, you can use the “use connection” feature to reuse the server details provided during connection creation.)

Creating An AD connection does not require providing Server Details
Creating a SQL Auth connection requires server details
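
To illustrate, here is roughly what those hardcoded details look like inside a flow’s JSON definition once an AD-authenticated SQL action is configured (the action, server, database and table names below are hypothetical placeholders, and the exact shape may vary):

"Get_rows": {
  "type": "OpenApiConnection",
  "inputs": {
    "parameters": {
      "server": "origin-sql.database.windows.net",
      "database": "OriginDb",
      "table": "[dbo].[Orders]"
    }
  }
}

It is precisely these embedded "server" and "database" values that the workaround described below rewrites.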

The other inconvenient aspect of AD authenticated connections is that (as of early 2020) the packaging experience in PowerApps, which allows exporting an app and its associated button-triggered flows as a package for later import into another environment, does not let you specify the server details either. As a result, when a flow using the AD connection is released into the new environment, it is still very much pointing to the server it was pointing to in the origin environment.

The only option is then to manually modify the hardcoded server details in the destination environment, which for a solution with a large number of flows is error-prone, time-consuming and, in all honesty, mind-numbing.

The solution: one flow to rule them all!

I was, however, able to create a workaround to this flow issue by using… a flow! The Power Automate API is in fact very powerful and lets you perform a large number of operations on your flows.

The first step is to specify the origin and destination server and database details. You might object: “but didn’t you tell us this whole solution is to avoid doing exactly this?!” Well, the answer is that once these details are in, they are used to update all of your flows, so in effect this solution requires you to specify the details just once, rather than howeverManyFlowsYouWantToUpdate times.

Populating server details, ONCE!
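
In our case the flow used a manual trigger with four text inputs, which later steps read via triggerBody(). The text_N names below are the default input names referenced by the replace expression further down; the mapping shown is my reading of how they are used (origin details in text_1/text_2, destination details in text_3/text_4):

origin server:        triggerBody()['text_1']
origin database:      triggerBody()['text_2']
destination server:   triggerBody()['text_3']
destination database: triggerBody()['text_4']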

The second step is quite simply to list the flows in your destination environment.

Listing destination env flows
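
Under the hood, the Power Automate Management connector’s “List My Flows” action calls something along the lines of the endpoint below (a sketch only; the exact URL shape and api-version may differ):

GET https://api.flow.microsoft.com/providers/Microsoft.ProcessSimple/environments/{environmentName}/flows?api-version=2016-11-01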

The next step is where the “magic happens”! It consists of a loop that updates the server details in all flows that need updating – let’s break it down into 3 parts.

Get details, Identify and Conditionally Update

Step 1: Get the details

First of all, in our use case, not all the flows were SQL flows needing to be updated, and these therefore had to be excluded from the operation – if not, the flow would essentially hang trying to update server credentials on flows triggering logic apps… The best way for us was to identify each flow by its name, which in turn requires getting the details of the flow (the “list my flows” method only returns very high-level information on the flow, which does not include the display name). Once the Get Flow step has run, many more parameters are available in further steps, such as Flow Display Name and Flow Description.
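
For instance, assuming the step is named Get_Flow (as in the update expression further down), the display name can be referenced with an expression along these lines:

body('Get_Flow')?['properties']?['displayName']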

Step 2: Identify

The second step, which immediately follows the one described above, is to branch out into two scenarios:

  • when the flow is indeed a SQL flow, then do attempt to perform the update (left branch in the chart above)
  • otherwise, simply leave the flow as is (right branch).

This check is done by matching the Flow Display Name (which is available thanks to having called Get Flow beforehand) and skipping the update operation for the flows that do not require it.
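
As a sketch, assuming your SQL flows share a recognizable token in their display names (the prefix “SQL” here is a hypothetical example), the condition’s expression could look like this:

contains(body('Get_Flow')?['properties']?['displayName'], 'SQL')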

Step 3: Update!

Finally, the last step is the update itself!

Updating the flow

Flows being essentially JSON objects, the Power Automate API lets you perform very useful operations here: simply populate the inputs of the Update Flow step with whatever you want the new flow to contain. In our case, we did not want to change the environment, flow name, display name and state, so we simply set them to the values we got from the Get Flow step – simple!
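
For those pass-through fields, the Update Flow inputs can be fed straight back from Get Flow, along these lines (a sketch; the property paths follow the Get Flow output used elsewhere in this post):

Flow Display Name:  body('Get_Flow')?['properties']?['displayName']
Flow State:         body('Get_Flow')?['properties']?['state']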

The flow definition is where it all really happens – and here is how: the general idea is to take the existing definition, which contains the existing server and database details, and replace those with the new details. Quite fortunately, Power Automate expressions allow us to do just that. The only additional trick is to wrap the result of the replace instruction in a json() function call so that the API accepts the updated content – all in all, here it is:

json(
  replace(
    replace(
      string(body('Get_Flow')['properties']['definition']),
      triggerBody()['text_1'],
      triggerBody()['text_3']
    ),
    triggerBody()['text_2'],
    triggerBody()['text_4']
  )
)

The next step would be to wire this up to an app to properly automate the process: the app could have several buttons such as “Dev to Test” and “Test to UAT”, each passing a different set of origin/destination server details depending on the scenario, which the flow would then consume to update the correct flows in the correct environment… one for next time!
