I am at my wits' end and would love some guidance. I have a cyclical process I'm trying to run, and I keep hitting errors with replicas stopping it.
1. Every night, ArcGIS Pro recreates a polygon layer by running a SQL query against an external dataset and updating the attributes of a polygon dataset of property parcels (the parcels themselves don't change, but the attributes do)
2. These data are pushed to AGOL via a Python script that overwrites the web layer
3. People using Field Maps get up in the morning, update their offline maps and collect data (adding attachments to the polygon layer and changing values in the column that gets updated nightly)
4. After work, the data from the field collection are downloaded into our external (non-GIS) database.
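For context, step 2 (the nightly overwrite) can be done with the ArcGIS API for Python. A minimal sketch, assuming the portal URL, credentials, item ID, and .sd path are all placeholders to substitute with your own:

```python
from arcgis.gis import GIS
from arcgis.features import FeatureLayerCollection

# Hypothetical credentials and item ID -- substitute your own.
gis = GIS("https://www.arcgis.com", "my_user", "my_password")
item = gis.content.get("your_item_id_here")   # the hosted feature layer item

# Overwrite the hosted service with a freshly published service definition.
flc = FeatureLayerCollection.fromitem(item)
flc.manager.overwrite(r"C:\data\parcels.sd")
```

This is the overwrite that fails when sync replicas exist, which is the problem described below.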
The problem I'm getting is that I cannot overwrite the web layer with an updated version, because I get errors that there are issues with replicas: "Service cannot be overwritten if Sync is enabled and Replicas exist." So, to do an overwrite, I have to go in and manually delete each replica.
Is there a better way to do this process so that I do not have replica issues? I need a way that each day people can add data to this dataset in the field (and offline), and each night, I need to download these data and update the dataset so that the next day the field workers see up-to-date info.
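For anyone hitting the same wall: the manual replica cleanup can at least be scripted with the ArcGIS API for Python. A sketch, assuming placeholder credentials and item ID (note this discards any unsynced field edits still sitting in those replicas):

```python
from arcgis.gis import GIS
from arcgis.features import FeatureLayerCollection

# Hypothetical credentials and item ID -- substitute your own.
gis = GIS("https://www.arcgis.com", "my_user", "my_password")
item = gis.content.get("your_item_id_here")
flc = FeatureLayerCollection.fromitem(item)

# Unregister every sync replica so the service can be overwritten.
for replica in flc.replicas.get_list():
    flc.replicas.unregister(replica["replicaID"])
```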
Maybe I could clarify more. My polygon layer stays the same. Every night ONE field gets updated (it declares which parcel polygons need to get new photos). To do this, my current method is to join a parcels layer with a .csv with an updated version of this field. Can I do that with a simple python script without tripping the replicas problem? If so, is there an example of this script I can modify? Thanks!
Oh yeah, if it's one field, just do a Join then a Calculate Field. It would be like 5 lines of code. In ArcGIS Pro, hosted services are interchangeable with local data in all the tools.
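A minimal sketch of that join-and-calculate approach, run against the hosted layer so no overwrite (and no replica deletion) is needed. The service URL, CSV path, and field names are all placeholders, and the table-qualified field names in Calculate Field will depend on your actual layer and table names:

```python
import arcpy

# Hypothetical service URL and CSV path -- substitute your own.
service_url = "https://services.arcgis.com/xxxx/arcgis/rest/services/Parcels/FeatureServer/0"
csv_table = r"C:\data\nightly_update.csv"

# Make a layer from the hosted service so it can participate in a join.
parcels_lyr = arcpy.management.MakeFeatureLayer(service_url, "parcels_lyr")

# Join the CSV on the parcel ID, copy the nightly value across, drop the join.
joined = arcpy.management.AddJoin(parcels_lyr, "PARCEL_ID", csv_table, "PARCEL_ID")
arcpy.management.CalculateField(
    joined,
    "Parcels.NeedsPhotos",              # field names are table-qualified after a join
    "!nightly_update.NeedsPhotos!",
    "PYTHON3",
)
arcpy.management.RemoveJoin(joined)
```

Calculating against the hosted layer edits features in place, so the sync-enabled service and its replicas are left alone.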
@DougBrowning, thanks so much for this tip.
I am currently trying this, but to say that this process is running slow is the understatement of the century. I loaded the layer from Portal into my Pro project (super fast). The join was also quick and easy. But Calculate Field has been running for 2 hours already. Yes, this is a big dataset (100k+ polygons), but Calculate Field on a local copy of this dataset takes about 10 seconds with these data. Is this speed to be expected? Is there something I could do (given I DO need all the data in this dataset) to speed it up?
Are you hitting the hosted service directly? Where is the join table located? If the join table is not local, you could try a quick copy to memory in Python at the top of the script and then use that in your join (just a Feature Class to Feature Class or Table to Table into in_memory). That is often much faster.
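A quick sketch of that copy-to-memory step, with a hypothetical CSV path; the in-memory table then stands in as the join table:

```python
import arcpy

# Hypothetical path -- substitute your own nightly CSV export.
csv_table = r"C:\data\nightly_update.csv"

# Copy the join table into the in-memory workspace first; joining against
# a local in-memory copy is usually much faster than joining a remote table.
mem_table = r"memory\nightly_update"
arcpy.management.CopyRows(csv_table, mem_table)

# ...then pass mem_table as the join table to arcpy.management.AddJoin.
```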