Hi everyone,
I'm trying to update a hosted feature layer in ArcGIS Online that contains over 800,000 records and around 50 fields. The goal is to reflect these changes in a web map and, in turn, in a connected dashboard.
I initially tried using the "Update Data" > "Update Fields and Values" option directly in ArcGIS Online. Although the process completed without errors, the updates were not reflected in the attribute table.
I'm now attempting the update through ArcGIS Pro, using the Append tool, as recommended by Esri documentation. However, the process is quite slow, and I'm unsure if this is the most efficient way to update a layer of this size.
Has anyone successfully updated a similar layer? Are there any best practices for large-scale updates like this, especially without breaking the web map or dashboard?
Any advice on whether to split the data or adjust fields beforehand would also be appreciated.
Thanks in advance for any guidance!
If the data has a unique identifier that links the updates to the existing features, you could write a script that loops through your updates and applies them in batches of 2,000. It would likely be a bit faster than Pro's editing method, but it's still going to be slow; the AGOL REST API isn't meant to handle large updates like this in a single shot.
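A minimal sketch of that batching approach with the ArcGIS API for Python. The item ID, the `updates` list, and the unique-ID field are placeholders you'd swap for your own; only the `chunked` helper is plain, runnable Python.

```python
# Sketch: apply attribute updates to a hosted feature layer in batches of
# 2,000 using the ArcGIS API for Python. Item ID and field names below are
# placeholders -- adapt them to your layer.

def chunked(items, size=2000):
    """Yield successive slices of at most `size` items."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

# With the arcgis package installed, the update loop would look like:
#
# from arcgis.gis import GIS
# gis = GIS("home")
# layer = gis.content.get("<your item id>").layers[0]
# updates = [...]  # feature dicts keyed by your unique-ID field
# for batch in chunked(updates, 2000):
#     result = layer.edit_features(updates=batch)
#     # check result["updateResults"] for per-feature success flags
```

Checking `updateResults` after each batch lets you log and retry just the features that failed instead of re-running the whole job.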
Hi @GerAcMa
It might be easier to copy the data locally into Pro, update the feature records there, and then republish, rather than trying to edit the existing service in place. Another option, as @AustinAverill mentioned, is a script, created either in a hosted Notebook or in a local Python IDE. With a script you can simply set it running and walk away.
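A hedged sketch of that republish route using the ArcGIS API for Python, assuming the layer was originally published from a zipped file geodatabase. An overwrite keeps the same item ID, so the web map and dashboard stay connected. The `source_ok` helper and all paths/IDs here are illustrative only.

```python
# Sketch: overwrite a hosted feature layer from a locally updated copy.
# overwrite() expects the same file type the service was originally
# published from (e.g. a zipped file geodatabase).

from pathlib import Path

def source_ok(path, allowed=(".zip",)):
    """Basic sanity check on the replacement file's type before overwriting."""
    return Path(path).suffix.lower() in allowed

# With the arcgis package installed:
#
# from arcgis.gis import GIS
# from arcgis.features import FeatureLayerCollection
# gis = GIS("home")
# item = gis.content.get("<your item id>")
# flc = FeatureLayerCollection.fromitem(item)
# if source_ok("updated_data.zip"):
#     flc.manager.overwrite("updated_data.zip")
```

Because the item ID is preserved, this avoids the "republish and re-wire every map" problem, but the schema of the replacement data must match what was originally published.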
What format of the update data are you trying to use (geojson, zipped file geodatabase, shapefile, etc.)?
As a test, can you create a test hosted feature layer and try adding just a subset of records, maybe 2K, to AGOL to see if that performs any better?
I have a much smaller dataset (14K records) for which a manual update in AGOL took only about a minute.
If you have the Data Interoperability extension for ArcGIS Pro, there are several data-update workflows available: change detection, overwriting the feature service (without changing its ItemID) using an AGOL feature service writer, using the UpdateData operation (as you did), and adding validation.
Here are a couple of blogs from the DI community with sample solutions for the latter:
https://community.esri.com/t5/arcgis-data-interoperability-blog/building-a-data-driven-organization-...
https://community.esri.com/t5/arcgis-data-interoperability-blog/building-a-data-driven-organization-...
I've seen another user add a Swap Source operation to their Data Interop workspace on two hosted feature layer views (each backed by its own hosted feature layer) to keep the data fresh on one view while the other is being updated.
Here's a webinar from Safe Software using the Swap Source solution for HFL views with Data Interop and the REST API:
https://fme.safe.com/webinars/geospatial-synergy-amplifying-efficiency-with-fme-esri-2/
You might look at the OverwriteFS script, available here:
https://arcgis.com/home/item.html?id=d45f80eb53c748e7aa3d938a46b48836
There is a sample notebook for using it here:
https://www.arcgis.com/home/item.html?id=cd5819375aca40a3a7f8b3a269404c2c
There is a tutorial here that shows how to use the script to update a layer on a scheduled basis:
https://learn.arcgis.com/en/projects/schedule-automated-near-real-time-data-updates/