
Share and maintain Big Data hosted feature layers in ArcGIS Online

02-02-2026 09:33 AM
BruceHarold
Esri Frequent Contributor

Here is my subject matter data, the State of New Jersey's parcel data (~3.5 million features, 4.4 GB with 45 fields), with thanks to the NJ tech team for their assistance putting this sample together.

I'll get to why one parcel is highlighted shortly...

ParcelsWithWeirdOne.png

First, despite the post's title, this discussion isn't tied to ArcGIS Online; you may be working with ArcGIS Enterprise and still implement this workflow, so stick with me.  The challenge of maintaining a hosted feature layer of big data is common to both.

The problem we're trying to solve here is applying a data update to a live service when the update transaction is very large.  In our case, tens of thousands of parcel edits are written several times a year, the features may be point rich, and the schema is wide.  Overwriting the service through the core ArcGIS Pro user interface ties up a session for a long time, so let's build some more efficient automation with ArcGIS Data Interoperability and write only the delta transaction.

To be specific, the changeset writing mode recommended for larger transactions is upsert.  This requires that the target feature service have a key field with a unique constraint, which the subject matter data has.  Upserts are sent in 10 MB chunks rather than in sets of features capped at the maximum row count supported by the service (2,000 for polygon data).
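To picture the chunking, here is a minimal sketch of splitting a list of feature dicts into payload-size-limited batches.  This is illustrative only, since the Data Interoperability writer does this internally; the 10 MB cap is the only number taken from the service behavior described above.

```python
import json

def chunk_edits(features, max_bytes=10 * 1024 * 1024):
    """Split a list of feature dicts into chunks whose serialized JSON
    payload stays under max_bytes (illustrative sketch; the upsert
    writer chunks internally)."""
    chunks, current, current_size = [], [], 2  # 2 bytes for "[]"
    for feat in features:
        # compact encoding, +1 byte for the separating comma
        size = len(json.dumps(feat, separators=(",", ":")).encode("utf-8")) + 1
        if current and current_size + size > max_bytes:
            chunks.append(current)
            current, current_size = [], 2
        current.append(feat)
        current_size += size
    if current:
        chunks.append(current)
    return chunks
```

The chunker preserves feature order, so a failed chunk can be retried without re-sorting the transaction.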

Maintaining hosted feature services by applying a delta transaction as edits is a well-trodden path, and change detection in ArcGIS Data Interoperability is ideal for it.  However, there are some things to note here:

  • The incoming data for the refresh is in file geodatabase format
  • The target workspace is a hosted feature service
  • The datasets are not co-located

This implies a few issues:

  • Streaming the hosted feature layer data locally to calculate the changeset would take a long time
  • Geometry, date and numeric fields need their precision in agreement for correct change detection
    • Subtle value differences require careful handling
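To illustrate the precision point, here is a minimal sketch of an attribute comparison that treats floats within a small tolerance as equal, so storage-precision noise is not reported as a change.  The tolerance value is an illustrative choice, not something prescribed by the tools.

```python
def values_match(a, b, float_tol=1e-9):
    """Compare two attribute values, treating floats within float_tol
    as equal (float_tol is an illustrative choice)."""
    if isinstance(a, float) or isinstance(b, float):
        try:
            return abs(float(a) - float(b)) <= float_tol
        except (TypeError, ValueError):
            # one side is not numeric (e.g. None), so not a match
            return False
    return a == b
```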

Precision agreement issues can be solved with ETL tool configuration, but to avoid them entirely the approach we'll take is to download the target feature service as its own file geodatabase, so storage-dependent precision differences are not a factor.  The changeset can then be easily calculated locally between two file geodatabase feature classes and the delta written efficiently.
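Once both datasets are local file geodatabases, the changeset calculation is conceptually just a keyed diff.  Here is a minimal sketch; PARCEL_ID is a hypothetical key field name standing in for whatever unique-constraint field your service uses.

```python
def compute_changeset(target_rows, incoming_rows, key="PARCEL_ID"):
    """Classify incoming rows against the downloaded snapshot of the
    service: inserts (new key), updates (key exists, row differs),
    deletes (key vanished).  PARCEL_ID is a hypothetical key field."""
    target = {r[key]: r for r in target_rows}
    incoming = {r[key]: r for r in incoming_rows}
    inserts = [r for k, r in incoming.items() if k not in target]
    updates = [r for k, r in incoming.items()
               if k in target and r != target[k]]
    deletes = [k for k in target if k not in incoming]
    return inserts, updates, deletes
```

With upsert mode, inserts and updates can go to the writer as one stream; only the deletes need separate handling.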

This is where the highlighted parcel in the map comes in.  Parcels may have complex geometry, boundaries may have multiple segments, and segments may be true curves.  While storing true curves in hosted feature layers is supported, editing them is constrained.  See here some relevant properties of my target feature service:

  {
    "allowGeometryUpdates": true,
    "supportsTrueCurve": true,
    "supportedCurveTypes": ["esriGeometryCircularArc"],
    "allowTrueCurvesUpdates": true,
    "onlyAllowTrueCurveUpdatesByTrueCurveClients": true
  }
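Here is a small sketch of how you might branch on these properties before deciding how to write geometry.  The property names are taken from the response above; the strategy labels are my own.

```python
def curve_update_strategy(props):
    """Given feature service layer properties like those above, decide
    whether true-curve geometries must be stroked to polylines before
    writing (strategy labels are illustrative)."""
    if not props.get("allowGeometryUpdates", False):
        return "no-geometry-edits"
    if (props.get("allowTrueCurvesUpdates")
            and not props.get("onlyAllowTrueCurveUpdatesByTrueCurveClients")):
        return "curves-ok"
    # curve updates restricted to true-curve clients, or not allowed
    return "stroke-curves"
```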

What you can take from this is that while some true curve editing is theoretically possible, any curves of type esriGeometryEllipticArc are not supported for editing, and guess what, a circular doughnut hole in a parcel has ellipse geometry.  Also, our ETL tool client is not known as a true curve client.

If you are using a release of ArcGIS Data Interoperability that does not support the Esri ArcGIS Feature Service writer you will need to use the feature service admin tools to set onlyAllowTrueCurveUpdatesByTrueCurveClients to false.

A simple way to manage true curves is to stroke them into polylines with the ArcStroker transformer, which gives control over the maximum deviation from the true curve.  Do this when comparing geometries and when writing to the feature service: arc segments are replaced with straight segments, temporarily for geometry comparison and permanently for any written parcel that is new or has been updated.
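For circular arcs the stroking math is simple: the sagitta of a chord subtending angle t on a circle of radius r is r(1 - cos(t/2)), so capping the sagitta at the maximum deviation fixes the largest usable segment angle.  A sketch of an ArcStroker-style step for circular arcs follows; it is illustrative only, not the transformer's actual implementation.

```python
import math

def stroke_arc(cx, cy, r, start_angle, end_angle, max_deviation):
    """Approximate a circular arc with straight segments so the sagitta
    (max distance from each chord to the arc) stays under max_deviation.
    Angles are in radians; illustrative only."""
    sweep = end_angle - start_angle
    # sagitta of a chord subtending angle t is r * (1 - cos(t/2)),
    # so the largest step angle satisfying the tolerance is:
    max_step = 2 * math.acos(max(1 - max_deviation / r, -1.0))
    n = max(1, math.ceil(abs(sweep) / max_step))
    return [(cx + r * math.cos(start_angle + sweep * i / n),
             cy + r * math.sin(start_angle + sweep * i / n))
            for i in range(n + 1)]
```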

Here are a couple of views of the workspace that does the whole job, first the Main view...

WorkbenchMain.png

...then the pale green looping custom transformer that waits for a file geodatabase export to complete...

WorkbenchLooper.png

The file geodatabase export takes a variable length of time, depending on how busy ArcGIS Online is.  My scheduled run at 3AM UTC took 23 minutes; I have seen 10 minutes, or an hour, and I have also seen failures when testing at busy times.  Scheduling the tool to run outside busy periods in North America and Europe is recommended, which is why I chose 3AM UTC.
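The wait-for-completion loop amounts to polling the export job status until a terminal state or a timeout.  A minimal sketch follows; the poll interval and two-hour timeout are illustrative choices, and check_status stands in for whatever status request your tool makes.

```python
import time

def wait_for_export(check_status, poll_seconds=30, timeout_seconds=7200):
    """Poll an export job until it completes or fails.  check_status is
    any callable returning the job's status string ('completed',
    'failed', or an in-progress value); interval and timeout values
    are illustrative."""
    waited = 0
    while waited <= timeout_seconds:
        status = check_status()
        if status in ("completed", "failed"):
            return status
        time.sleep(poll_seconds)
        waited += poll_seconds
    return "timed-out"
```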

As some defensive coding, just ahead of the looping transformer that waits for export job completion, and again inside it, there are a couple of Emailer transformers that send job submission details and, if a failure occurs, job failure details.  Esri Support will need both the job and failure information to troubleshoot your service behavior on error, so please do open a support call if you experience any problems.

Here is an example job details email body:

Feature service:

https://services.arcgis.com/FQD0rKU8X5sAQfh8/arcgis/rest/services/NJParcels/FeatureServer

Feature service export job:

9f77069e-e212-46bb-8696-b7ce4f54c882::FQD0rKU8X5sAQfh8

of service item:

4480efce4518473096613597d461e55f

to export item:

1fc913da28174608b9a65859bcc8b9b0

started at local time:

2026-01-26T07:03:23.5436242-08:00

Type is file and Size is 4823392256

Here is what an export failure message looks like (the translation will terminate):

Feature service export job has failed with status failed

Failure was at local time 2026-01-26T09:05:01.0944432-08:00

JobId was 9f77069e-e212-46bb-8696-b7ce4f54c882::FQD0rKU8X5sAQfh8

Status request response was

{"status": "failed","statusMessage": "failed","itemId": "4480efce4518473096613597d461e55f"}
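If you script around this yourself, the failure message body is just a few fields pulled from that status response plus the job id.  A small sketch, with field names matching the example response above:

```python
import json

def summarize_failure(response_text, job_id):
    """Pull the fields used in the failure email from the raw status
    response (field names match the example response)."""
    info = json.loads(response_text)
    return {
        "jobId": job_id,
        "status": info.get("status"),
        "statusMessage": info.get("statusMessage"),
        "itemId": info.get("itemId"),
    }
```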

The workspace is in the blog download; you will need to edit it for your ArcGIS Online credentials, feature service details, and Emailer transformer parameters.

Please do comment in this board with your experiences and questions!

 
