
Distributed collaboration (as copy) does not work when table is updated with SQL?

4 weeks ago
Jay_Gregory
Regular Contributor

I'm assuming this is unsupported, but I wanted to make sure there isn't a workaround, or that I'm not simply doing something wrong.

I am creating a table in my enterprise geodatabase, enabling global IDs and archiving, and sharing it as a referenced feature service with sync enabled to my Portal. I do all of this with ArcGIS Pro / Python.

When I share this feature layer through my distributed collaboration (configured to send as copies), a hosted feature layer is created in my ArcGIS Online organization reflecting the data in my enterprise geodatabase. That works as expected.

However, this table is periodically updated outside ArcGIS - that is, with native SQL. The process truncates the table and repopulates it with new data, generating new global IDs for each row on load.

This is where the distributed collaboration breaks.
My logs say: "failure in processing exports for Replica", "Failed to export data changes message for replica with Guid", "Failed to export data changes to replica", and "Invalid column value [globalid]".

So I'm assuming something is happening with the global IDs. The values look to be in the standard {8}-{4}-{4}-{4}-{12} format, where each number is the count of hex characters in that group (e.g. 52B2EBC3-DBA2-46C1-93F1-0D6DD52A2F13).
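One cheap thing to rule out is the layout of the SQL-generated values themselves. A quick check like the sketch below (an illustration only - passing it says nothing about what the replica export actually requires, and the function name is made up here) can confirm every loaded value parses as a GUID and matches the uppercase 8-4-4-4-12 pattern shown above:

```python
import re
import uuid

# Strict pattern for the uppercase 8-4-4-4-12 hex layout described above.
GUID_PATTERN = re.compile(
    r"^[0-9A-F]{8}-[0-9A-F]{4}-[0-9A-F]{4}-[0-9A-F]{4}-[0-9A-F]{12}$"
)

def is_valid_globalid(value: str) -> bool:
    """Return True if `value` parses as a GUID and matches the
    uppercase 8-4-4-4-12 layout (no curly braces)."""
    try:
        uuid.UUID(value)
    except (ValueError, AttributeError, TypeError):
        return False
    return bool(GUID_PATTERN.match(value))

print(is_valid_globalid("52B2EBC3-DBA2-46C1-93F1-0D6DD52A2F13"))  # True
print(is_valid_globalid("52b2ebc3-dba2-46c1-93f1-0d6dd52a2f13"))  # False: lowercase fails this strict check
print(is_valid_globalid("not-a-guid"))  # False: not a parseable GUID
```

Running this over the column after a load would at least tell you whether the DBA's generator is producing anything structurally different from what ArcGIS writes.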

So two questions:

1. Is it unsupported to maintain a distributed collaboration when the source table is maintained outside of ArcGIS?

2. If it is supported, is there a different process our DBA should follow so that the synchronization succeeds?


1 Reply
jcarlson
MVP Esteemed Contributor

We've got a somewhat similar process here, but we do not use a distributed collaboration to update the hosted copy, as we have run into various issues (though not your specific problem) in the past. Still, I don't see why a collab couldn't take the updates, regardless of where they happen.

Our update process is:

  1. Use pandas.read_sql() to execute SQL and query the source table(s) directly.
  2. Make whatever modifications are needed to get the source data compatible with AGOL / Portal.
  3. Either:
    1. Truncate / append to the hosted layer (you can use arcgis.features.GeoAccessor.to_featureset() to get the DataFrame into an AGOL-ready object), or
    2. Use pandas.DataFrame.compare() or arcgis.features.GeoAccessor.compare() to compare the source with the hosted copy and identify the specific subset of rows that actually need editing.

Not sure how feasible it would be for you to pursue a similar method, but it cuts out the middle-man of setting up a collaboration.

- Josh Carlson
Kendall County GIS