The Power Automate KPI connector allows you to save KPI Actual or Target data to your PIPware instance. This document is an instructive example showing how to connect the connector to a source SQL database; however, the source could be any other Power Automate connector of your choosing. If you would like more information on the PIPware connector, such as throttling limits or the API data types, please also read the documentation here: 


What you will need

  1. A Power Automate Premium license. We recommend assigning the license to a service account. It is also advisable to use the same account for the PIPware user. That way you can use that user's API key together with Power Automate without worrying about the details changing if someone leaves.
  2. A PIPware user with the KPIActuals and/or KPITargets permission. As mentioned above, we recommend using the service account details for the PIPware user.


Use case

This document illustrates configuring the following flow:

  1. KPI data is inserted into a source database table
  2. This automatically triggers the Flow set up in Power Automate
  3. The first step of the Flow fetches the data from the source table
  4. The data flows to the second step, the PIPware KPIs connector
  5. The PIPware KPIs connector sends the data to your PIPware instance

The final flow will look like the below:

Setting up the Source

Below we have created a database called “testdatasource” with a table that will contain the source data. Our flow will be triggered when data is inserted into this table. This is one of many possible approaches; Power Automate provides a variety of Connectors to trigger flows or fetch SQL data, such as executing a stored procedure to fetch the data.
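A minimal source table for this scenario might look like the following. The table and column names here are illustrative assumptions chosen to match the fields the connector expects; adapt them to your own schema.

```sql
-- Illustrative source table in the "testdatasource" database.
-- The "When an item is created (V2)" trigger requires a primary key
-- on an IDENTITY column to detect new rows.
CREATE TABLE dbo.KpiActualsSource (
    Id          INT IDENTITY(1,1) PRIMARY KEY,
    KPICode     NVARCHAR(50)  NOT NULL,  -- identifies the KPI in PIPware
    ActualValue DECIMAL(18,4) NOT NULL,  -- the actual value to save
    [Date]      DATETIME      NOT NULL,  -- the period the value applies to
    Deleted     BIT           NOT NULL DEFAULT 0
);
```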

We recommend using a stored procedure, as it is easier to maintain and test. SQL Server also optimises stored procedures by caching a query plan, which results in faster execution times. For this example, however, Power Automate will fetch the data directly from the database table.
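If you do take the stored-procedure route, a sketch might look like the following. It assumes an illustrative source table dbo.KpiActualsSource with KPICode, ActualValue, [Date] and Deleted columns; the procedure and parameter names are assumptions.

```sql
-- Illustrative stored procedure returning KPI rows from a given date onwards.
-- Power Automate's "Execute stored procedure (V2)" action can call this.
CREATE PROCEDURE dbo.GetKpiActualsSince
    @Since DATETIME
AS
BEGIN
    SET NOCOUNT ON;
    SELECT KPICode, ActualValue, [Date], Deleted
    FROM dbo.KpiActualsSource
    WHERE [Date] >= @Since;
END
```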

Installing the Data Gateway

For Power Automate to have access to the source database, you must install and configure the Data Gateway. The documentation and download can be found here:

When setting up the Gateway, please take note of the following:

  • The email address that you log in to the gateway with must be the same one that holds the Power Automate Premium license you are using to create your flows.
  • The gateway must sit on a server that has permanent internet access.
  • The gateway must be in the same region as your Power Automate environment. Documentation, including the mapping table between Gateway regions and Power Automate regions, can be found here:


The below screenshots take you through how it is configured.

 Select an installation directory and accept the terms

Supply the same account that is used for your Power Automate license. As mentioned earlier in this document, we recommend using a service account that is not tied to a particular individual.

You will then be prompted to sign in using that account.

Select “Register a new Gateway”

Change the region to match your Power Automate environment region. Instructions on finding your Power Automate region and mapping it to the gateway region can be found here:

Also enter a name and a recovery key.

In this example we have changed the region to East US to match our Power Automate environment:

You will see the following screen if the configuration was successful. You are done and can click Close.

Configuring the Flow in Power Automate


In Power Automate, create a new Automated Flow.


Give the Flow a name and select the “When an item is created (V2)” SQL trigger.


On the first step, click the menu and select “Add connection”. We will now connect this step to our source database through our gateway.


From the Gateway dropdown, select the Gateway we installed in the previous section. If you do not see it in the list make sure the Gateway was configured using the same account and region as your Power Automate environment, as detailed previously in this document.

Fill in the rest of the details to connect to the source database and click Create. Use credentials for a SQL or Windows user configured on the database that has access to the table we want to read from. If you do not have any credentials or do not know how to set them up, ask for assistance from your database administrator.


If the previous step succeeded you will then be directed back to the flow. Select the Server, database and table from the Dropdowns. If it does not automatically load all the tables in the dropdown, choose “enter custom value” and type the table name as shown below.

Click “Add a new step”, search for “PIPware” and add the “Save KPI Actuals” action. You could also save PIPware Targets, but in this example we are only going to work with Actuals data.


You will then be asked to enter information to connect to the PIPware instance that you would like to save the data to. Fill in all the details. The API key is the API key of a user on PIPware with the KPI Actuals permission; we recommend using a service account in PIPware. If you have lost your key, you can regenerate a new API key from the Profile page, but this will cause any existing integrations to break.


Using the dropdowns, assign the values that come from the SQL Connector to the corresponding fields that will be sent to PIPware. Configuring the date will be shown in the next section.

For the date field, you will notice that a Dynamic SQL option does not automatically appear in the dropdown. This is because the API specification can only receive dates as strings in a particular format, whereas the value coming from our SQL data source is a DateTime type.

We must map the DateTime to the correct format using an expression.

 We have entered the expression
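An expression along the following lines converts the SQL DateTime into a string. The column name Date and the output format 'yyyy-MM-dd' are assumptions here; use your own column name and the format required by the API specification.

```
formatDateTime(triggerOutputs()?['body/Date'], 'yyyy-MM-dd')
```

formatDateTime is a built-in Power Automate expression function, and triggerOutputs()?['body/ColumnName'] is the standard way to reference a column from the trigger's output.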



For the “deleted” field, you can choose a default value that will always be used. Alternatively, to source the value from the SQL data, choose “enter custom value” and search the Dynamic Content, just as was done for KPICode and ActualValue.


 You can test the flow by clicking the Test button. Note that testing will save real data to your live instance.

From there you can choose to perform the trigger action (inserting data into the source table). The “Use data from SQL server” option will use whatever data already exists in the source table. WARNING: by running any test you will be saving data to your PIPware instance.
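To perform the trigger action, you can insert a test row into the source table. The statement below assumes an illustrative table dbo.KpiActualsSource with KPICode, ActualValue, [Date] and Deleted columns, and the KPI code shown is a made-up example; substitute your own names and values.

```sql
-- WARNING: inserting this row will trigger the flow and save
-- real data to your PIPware instance.
INSERT INTO dbo.KpiActualsSource (KPICode, ActualValue, [Date], Deleted)
VALUES ('SALES-001', 1234.56, '2024-01-31', 0);
```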