I have a large feature layer hosted on ArcGIS Online. The layer has (or rather, will have over time) a large number of attachments, and I am looking for the best way to create a daily local backup of the feature layer and attachment data. Because of the data volume (an estimated 90 GB in the end), a simple export and download is not very practical, at least not daily.
I looked into replicas and tried to set up a synchronization, but ran into problems. Setting up the replica as a file geodatabase seems to work only without synchronization, so it's basically the same as an export. Setting it up as a JSON replica seemed to work, but I couldn't figure out how to import the data into ArcGIS Pro/ArcMap; the JSON replica does not appear to be valid Esri JSON. More or less the same with the SQLite variant.
Has anyone dealt with this scenario before and has some advice? As I said, it is just a one-way backup, so sync capability is not important. I just want a local backup that is done incrementally, so I don't have to download the same data over and over again.
Any help is appreciated.
Hi Alexander Zeller,
A few ideas based on your post:
Have you looked into some of the conversion tools in Pro, such as the JSON To Features geoprocessing tool? These might help with importing the data from the JSON replicas.
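One common reason a conversion tool rejects a replica file is that the replica JSON wraps each layer's feature set inside a top-level "layers" array, rather than being a standalone Esri JSON FeatureSet. A minimal sketch of splitting it apart, assuming that structure (the key names and the sample document here are illustrative, so check them against your actual replica file):

```python
import json

def split_replica_json(replica: dict) -> list[dict]:
    """Pull each layer out of a replica JSON document and rebuild it as a
    standalone FeatureSet-style JSON object that conversion tools can read.
    Assumes the replica has a top-level 'layers' list whose entries carry
    'fields', 'geometryType', 'spatialReference', and 'features' keys."""
    feature_sets = []
    for layer in replica.get("layers", []):
        feature_sets.append({
            "geometryType": layer.get("geometryType"),
            "spatialReference": layer.get("spatialReference"),
            "fields": layer.get("fields", []),
            "features": layer.get("features", []),
        })
    return feature_sets

# Minimal fake replica document for illustration:
replica_doc = {
    "layers": [{
        "id": 0,
        "geometryType": "esriGeometryPoint",
        "spatialReference": {"wkid": 4326},
        "fields": [{"name": "OBJECTID", "type": "esriFieldTypeOID"}],
        "features": [{"attributes": {"OBJECTID": 1},
                      "geometry": {"x": 0, "y": 0}}],
    }]
}

# Write each layer out as its own JSON file.
for i, fs in enumerate(split_replica_json(replica_doc)):
    with open(f"layer_{i}.json", "w") as f:
        json.dump(fs, f)
```

Each resulting layer_N.json could then be fed to JSON To Features in Pro; if the tool still complains, compare the file against a FeatureSet produced by a plain layer query (f=json) to see which keys differ.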
Another idea could be to use the FGDB export option with an attribute query in the LayerQueries parameter to obtain only features that were edited within the last day.
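To sketch that idea: the createReplica REST operation on a sync-enabled feature service accepts a layerQueries parameter, so you can restrict each layer to recently edited features. A hedged example that only builds the request parameters (the edit-tracking field name last_edited_date and the replica name are assumptions; verify them against your own layer's schema):

```python
import json
from datetime import datetime, timedelta, timezone

def build_create_replica_params(layer_ids, edit_field="last_edited_date"):
    """Build form parameters for a feature service's createReplica REST
    operation, filtering each layer to features edited in the last 24 hours.
    The edit-tracking field name is an assumption; check your layer."""
    since = datetime.now(timezone.utc) - timedelta(days=1)
    where = f"{edit_field} >= TIMESTAMP '{since:%Y-%m-%d %H:%M:%S}'"
    layer_queries = {
        str(i): {"queryOption": "useFilter", "where": where, "useGeometry": False}
        for i in layer_ids
    }
    return {
        "f": "json",
        "replicaName": "daily_backup",          # illustrative name
        "layers": ",".join(str(i) for i in layer_ids),
        "layerQueries": json.dumps(layer_queries),
        "returnAttachments": "true",            # include attachment data
        "dataFormat": "filegdb",                # ask for a file geodatabase
        "async": "true",                        # large exports run async
        "syncModel": "none",                    # one-way snapshot, no sync
    }

params = build_create_replica_params([0])
# POST these (with a token) to
# https://services.arcgis.com/<org>/arcgis/rest/services/<service>/FeatureServer/createReplica
# then poll the returned status URL for the download link.
```

With syncModel set to none the replica is a checkout-style snapshot, which matches the one-way backup requirement; the where clause is what keeps each daily download small.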
Also, if you have ArcGIS Enterprise, you can sync deltas to a copy of the service with distributed collaboration. The attachment structure was modified in 10.8.1, so attachments are copied from ArcGIS Online to Portal with the sync.
Hope this helps,
I'm trying to implement a similar backup schedule for large hosted feature layers in AGOL. I've been looking into replicas and the Pro file geodatabase to file geodatabase tool, but these options don't handle large datasets that well. The issue is how long the export takes to download.
Did you manage to get a process that worked?