Using ArcGIS Connectors for Power Automate to get Custom Geometry in Power BI

02-10-2022 07:00 PM
SeanKMcGinnis
Esri Contributor

Many customers have shared that they would like to bring their features into Power BI reports but have not been able to - until now.

With the recent release of the ArcGIS Connector for Power Automate, users can now reference hosted feature layers and convert them to CSV files that can be added to Power BI reports. Flows can be scheduled or manually triggered to create a CSV file to support your business intelligence needs in Power BI.

In this example, the flow is manually triggered and immediately creates a collection of variables to support the iteration through all the features in the feature layer.

Manually triggering the flow and creating three supporting variables

Using the Initialize Variable action, the flow creates three variables supporting different aspects of the flow. The three variables are:

  • Offset - an integer variable used to track which record the flow is working with. It supports paging through the records, keeping track of the position when there are more than 150 records and another call has to be made back to the service.
  • Header - a string variable representing the column headers, which will be used when writing the CSV file
  • Data - a string variable that will hold the geometry and attribute information for the records

The next step in the flow is to get the data from the feature layer and iterate through all the records. To do this, use the Do until action to wrap the Get data from feature layer action, which returns records in pages.

[Screenshot: the Do until action wrapping the Get data from feature layer action]

The Get data from feature layer action needs a few user defined elements:

  • where to discover layers in the user's ArcGIS Online organization - the Layers in input allows selecting from the user's content, the broader organization, items the user has favorited, or a custom value
  • which layer to use, in the Feature layers input
  • how to format the data, in the Output format input - in this scenario, CSV
  • the Starting count, which tracks where the flow is within the result set
  • which records to return, using a Where statement - to get all of the results, use '1=1'; for more information on forming Where queries, review the ArcGIS REST API documentation
  • the Return geometry input, set to 'Yes' so the geometry can be displayed in the Power BI report

Now that the flow has data to work with, set the exit condition for the Do until. Click the Add dynamic content link in the Do until action, select Dataset has additional data, set the condition to is equal to, and set the value to no.
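The paging pattern the Do until loop implements can be sketched in Python. Here get_page is a hypothetical stand-in for the Get data from feature layer action: it returns one page of up to 150 records plus a flag mirroring Dataset has additional data.

```python
# Sketch of the Do until paging pattern (get_page is an illustrative
# helper, not part of the connector's API).
def page_through(get_page, page_size=150):
    offset = 0                # the Offset variable
    bodies = []
    has_more = True
    while has_more:           # Do until: Dataset has additional data is "no"
        body, has_more = get_page(offset)
        bodies.append(body)
        offset += page_size   # advance the offset for the next call
    return bodies
```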

[Screenshot: configuring the Do until exit condition]

We are not done working in the Do until action - we need to set and append the variables defined at the beginning of the flow.

Set the Offset variable to the Current offset of the data

[Screenshot: setting the Offset variable]

Setting the Header variable requires the flow to split the output from the data and take the first row as the column headers.

[Screenshot: setting the Header variable]

The value expression for setting the Header is:

 

split(outputs('Get_data_from_Feature_Layer')?['body'],decodeUriComponent('%0D%0A'))[0]
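In this expression, decodeUriComponent('%0D%0A') decodes to a carriage-return/line-feed pair (CRLF), so the expression splits the CSV body on line breaks and takes the first element. A rough Python equivalent, with an illustrative body value:

```python
from urllib.parse import unquote

# '%0D%0A' percent-decodes to '\r\n' (CRLF), the CSV line separator
body = "OBJECTID,NAME\r\n1,Fulton\r\n2,Cobb"
delimiter = unquote("%0D%0A")        # == "\r\n"
header = body.split(delimiter)[0]    # first row: the column names
print(header)  # OBJECTID,NAME
```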

 

 

The Data variable is appended using an Append to string variable action with a replace function applied to the output from the dataset. This adds the geometry and attributes for each record, which will be written to the resulting CSV file.

[Screenshot: appending the Data variable]

The value expression for setting the Data variable is:

 

replace(outputs('Get_data_from_Feature_Layer')?['body'],variables('Header'),'')
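The replace strips the header row from each page's output before appending, so only record rows accumulate in Data. A rough Python equivalent with illustrative values:

```python
body = "OBJECTID,NAME\r\n1,Fulton\r\n2,Cobb"
header = "OBJECTID,NAME"

# remove the header row so only record rows are appended to Data
rows = body.replace(header, "")
print(repr(rows))  # '\r\n1,Fulton\r\n2,Cobb' - the leading CRLF is kept
```

That leading CRLF is what lets the later concatenation of Header and Data reassemble a well-formed CSV.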

 

After the Do until action iterates through all the records, the flow needs to concatenate the header and data and write them to a file. To concatenate the Header and Data, the flow creates a new Result variable using the Initialize variable action and sets the value using a concatenation expression:

[Screenshot: initializing the Result variable]

 

concat(variables('Header'),variables('Data'))

 

The final step is to write the Result to a CSV file using the Create file action. 

[Screenshot: the Create file action]

The Create file action defines:

  • the folder path where the file will be saved
  • the name of the file to be saved
  • the file content, which is the Result variable that concatenated the header and data  
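The last two steps can be sketched together in Python: concatenate the header and accumulated data, then write the result to a file. The file name and values here are illustrative.

```python
header = "OBJECTID,NAME"
data = "\r\n1,Fulton\r\n2,Cobb"   # accumulated by the Append to string actions

# concat(variables('Header'), variables('Data'))
result = header + data

# the Create file action: folder path, file name, and file content
with open("GeorgiaCounties.csv", "w", newline="") as f:
    f.write(result)
```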

In the end, the flow looks like:

[Screenshot: the completed flow]

...and the visualization in Power BI looks like:

[Image: the resulting map visualization in Power BI (GeorgiaCounties.png)]

 

Tips & Lessons Learned

  • It is always a best practice to rename actions, but be aware that changing an action's name also changes how downstream actions and expressions reference it.
  • There is a 32,767 character limit in Excel and Power BI fields. In testing, feature layers with large or complex geometry have exceeded the maximum number of characters, forcing the record to overflow onto a new line. If you notice missing features in the map, open the CSV file in Excel and check whether a new row was created. If you do encounter this issue, you can generalize the feature class in ArcGIS Pro to reduce the number of vertices and republish the service.
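If you suspect you are hitting this limit, one quick check is to scan the exported CSV for cells over 32,767 characters. A minimal Python sketch (the file path and column names are illustrative):

```python
import csv

EXCEL_CELL_LIMIT = 32767  # max characters per cell in Excel / Power BI

def oversized_cells(path):
    """Yield (row_number, column_name, length) for cells over the limit."""
    with open(path, newline="") as f:
        # start=2: row 1 of the file is the header row
        for row_num, row in enumerate(csv.DictReader(f), start=2):
            for column, value in row.items():
                if value and len(value) > EXCEL_CELL_LIMIT:
                    yield row_num, column, len(value)
```

Any hits point at geometries worth generalizing before republishing the service.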

Thanks in advance and please let me know how this works for you or if you run into any issues.

18 Comments
SteveHalstead
New Contributor

This is brilliant, thanks!

SeanKMcGinnis
Esri Contributor

Thanks @SteveHalstead - please let me know how it works for you and if you have any comments or feedback.

TimMinter
Occasional Contributor III

Thanks for the detailed instructions @SeanKMcGinnis.  Can I expect this to work with Power BI Desktop and our on-premises ArcGIS Enterprise deployment with no access from outside our organization's domain?  All the documentation I've seen calls out "ArcGIS Online," so I'm expecting that I cannot use it with our ArcGIS Enterprise deployment.

SeanKMcGinnis
Esri Contributor

Good day @TimMinter - the current deployment does require ArcGIS Online. The team is working on finalizing a connector that would work with ArcGIS Enterprise. The new connector would work in more isolated environments and we are planning to deploy it to Microsoft early this year.

AW_Grove
Occasional Contributor

Does anyone have best practices of where to store the csv exports for use in Power BI? I want to stay away from OneDrive since that is tied to a person. SharePoint seems the best place, but was curious if anyone found a better option.

TimMinter
Occasional Contributor III

@AW_Grove What about adding them to an AGO organization or to an AGE portal?  Without knowing anything yet about how the upcoming connector for ArcGIS Enterprise works, my first solution idea (which might be discarded quickly) is:

  • Add "make CSV for Power BI" to the tail end of our data update ETL procedures (ugh.  so much ETL.  Why can't an ArcGIS map service just cough it up?)
  • Update the CSVs in portal and tickle the item dates as needed to communicate freshness accurately.

Gah.  Second solution idea...  Koop

idk,

tim

SeanKMcGinnis
Esri Contributor

@AW_Grove - I am not sure if there is a best practice when it comes to getting the data to an anonymous source, but I think a SharePoint directory would probably be a good destination for the file. You can easily control the access and not have it tied to an individual as they progress through their roles.

@TimMinter - while the files can be published there as items, Power BI is unable to read from ArcGIS Online or Enterprise as a data source. One of the difficulties when working across multiple platforms is accommodating the security and file constraints - Microsoft has certain considerations when accessing data for reports and does not accept remote files. To complete the workflow you are suggesting, Power Automate is a good tool: it can execute a flow that gets data from services on an event (like an update) or on a schedule, with actions to refresh the Power BI data and keep the report data fresh. You would have to weigh the refresh frequency limits of Power BI datasets against your data currency SLA, but it is an option to reduce the manual ETL processes.

AW_Grove
Occasional Contributor

Thanks @SeanKMcGinnis ! And I feel ya @TimMinter - it would be great if there was an easier way to get the data securely.

 

@SeanKMcGinnis - if you have a feature class over 60k records, is it best practice to query your dataset in AGOL so you do not exceed the Power Automate limits, and then combine the data on the back side through a dataflow in Power BI? Or is there a way to restart the 'Do Until' loop and keep going until you have all 100k, 500k, 1M, etc. records?

 

********** Update ******** 

I don't know how I missed this, or what the limits are, but you can change how many times Power Automate runs the 'Do Until' loop. I have it on 1000 and it's working (but my feature class only has 70k records). The default is 60 loops.

[Screenshot: the Do Until loop count setting]

 

 

AW_Grove
Occasional Contributor

@SeanKMcGinnis  - Hi Sean, we have been using this successfully for almost 2 weeks and then the AGOL update seems to have broken our process (or maybe something else, just coincidence in timing) . We are now seeing errors like below, even though the data size hasn't changed from last week. Any ideas of what is going on? The issue started on 2/22 as far as I can see from our failure logs. 

[Screenshot: the error message]

 

SeanKMcGinnis
Esri Contributor

@AW_Grove - I am sorry you are running into this and that is a message I have not seen before.

What triggers/actions are you using that are returning that message?

AW_Grove
Occasional Contributor

Hi @SeanKMcGinnis  - We also had big problems with taking maps offline in Field Maps last week after the AGOL upgrade. I think this was related since I made no changes and everything was resolved Monday morning. 

CoreyWilliamsESRIAccount
New Contributor II

Hi @SeanKMcGinnis , thank you for this post. It was very easy to follow and export our org's data for use in Klipfolio.

I am exporting data from a Survey123 hosted feature layer, and multi-line text questions are splitting onto the next rows. Is it possible to select the fields to be exported? Or is there a way to concatenate the iterations as JSON to use in Klipfolio?

[Screenshot: exported CSV with rows split by multi-line text]

SeanKMcGinnis
Esri Contributor

I have not encountered that issue before. When you say you are exporting them, how are you exporting them? Are you using Power Automate?

nikos
New Contributor

Hi @SeanKMcGinnis ,

Thank you for the very helpful, detailed instructions.

I am looking to develop a solution that will likely use the steps above (don't currently have an ArcGIS Online Organization account or a hosted feature layer, but will create one if the solution proves to be fit for purpose). I am trying to check if the solution is fit for purpose in terms of  how the exported CSV appears in the Power BI visualisation (and potential formatting to that).

Would it be possible to provide one CSV file with an exported layer (shapes and dummy data) as shown in your example that I could use within Power BI desktop to check the visualisation?

Thank you in advance.

alo5108
New Contributor II

Any update on this being available for ArcGIS Enterprise?

I have a custom connector built for Survey123 in Power Automate but am not sure one can be built to update features.

TimMinter
Occasional Contributor III

@nikos @alo5108 - several community members have come up with a partial solution for getting ArcGIS Enterprise services into Power BI. Power BI has a technical constraint in that it cannot handle more than ~32,000 characters in a single cell value. We frequently run into this constraint when we supply polygon coordinate pairs that exceed the max value. So, sometimes not all of the polygons draw, and then we have to look for ways to get the coordinate pair characters below the max value (generalize, limit precision, hope, etc.). This info might be useful to you.

cheers,

tim

EnvironmentalCompliance
New Contributor

@SeanKMcGinnis , I have created a flow using the essence of how yours is set up - it pulls data, creates a table summarizing the data from a view layer and emails the tabular data from the feature layer.  

I've set up the "Do Until" in the same manner with the counter and parameters; however, it appears that the Do Until runs several times over before stopping. Is there anything obvious that I'm doing wrong?

[Screenshot: the Do Until configuration]

 

GBacon
Occasional Contributor

I'm trying this for the first time and it's not letting me past the Initialize Variable after the append because the initialization has to be done at the top level of the flow. Has something changed since this post? 

[Screenshot: the Initialize Variable error]

Also, if I'm writing to JSON instead of CSV, do I need the header followed by append, concatenate, etc.? I appreciate your help.