Hi everyone!
I am working with large AGOL datasets and have been running into timeout issues with the AGOL API when performing large field calculations, both through ModelBuilder and by manually performing "select", "join", then "calculate field". ModelBuilder will time out during the calculation step of this process, forcing me to attempt the field calculation myself. For example, I am trying to update over 2,000 records with work order numbers from another AGOL layer. We join the two AGOL layers and then use a field calculation to copy the information over to those 2,000+ records. We have discovered that the limit for the AGOL API to perform large calculations is somewhere between 1,000 and 2,000 records before it throws a runtime error.
To counter this, we can bring some AGOL layers local, perform the necessary prep work with the model, then have the model join the prepped local layer with the AGOL layer, where it performs the final join and calculations. However, this is not foolproof, and the model will occasionally still throw the runtime error. For some AGOL layers, bringing them local is impossible: either Pro will say that it cannot be done, or the type of calculation would require a complete overwrite of the AGOL layer, which we are against doing. We do not want to have to take the layers down, rewrite them, and republish them every week. I was wondering if anyone has found a solution for creating a model, or script, that performs batch calculations: grab a set number of records, perform the calculation, then move on to the next batch of records. What would the steps be for creating this process and working around the AGOL API limit?
We are running off of ArcGIS Pro 3.3.
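For reference, the kind of loop I have in mind looks roughly like this. This is only a sketch: the layer name, field names, and batch size are placeholders, and the join between the two layers is assumed to already be in place in the map.

```python
# Rough sketch of the batching idea (ArcGIS Pro / arcpy).
# Layer name, field names, and batch size are placeholders.
import arcpy

joined_layer = "TargetLayer"  # AGOL layer with the join already applied
batch_size = 500              # stay well under the apparent 1,000-2,000 limit

# Grab every object ID up front so we can walk through them in batches
oids = [row[0] for row in arcpy.da.SearchCursor(joined_layer, ["OID@"])]

for i in range(0, len(oids), batch_size):
    batch = oids[i:i + batch_size]
    where = "OBJECTID IN ({0})".format(",".join(str(o) for o in batch))
    arcpy.management.SelectLayerByAttribute(joined_layer, "NEW_SELECTION", where)
    # Calculate only the selected records, then move on to the next batch
    arcpy.management.CalculateField(
        joined_layer, "WONUM", "!WONUM_source!", "PYTHON3"
    )
```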
At the risk of repeating an answer I gave to another user regarding batch updates via joins and calculations...
Please have a read of this post by Richard Fairhurst:
Turbo-charging data manipulation with Python
The basic premise is to use an update cursor to apply a dictionary you build from a common value shared between your two datasets. If you scroll down to the section titled:
"Using a Python Dictionary Built using a da SearchCursor to Replace a Join Connecting Two Feature Classes"
...there is sample code that will help you. I suspect you'll find the speed difference incredible!
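A minimal sketch of that pattern is below; the layer and field names are placeholders, and Richard's post has the full version.

```python
# Dictionary-based "join": build a lookup from the source layer,
# then apply it to the target with an update cursor.
# Table and field names below are placeholders.
import arcpy

source = "SourceWorkOrders"  # layer holding the values to copy
target = "TargetAssets"      # layer being updated
key_field = "ASSET_ID"       # common value shared by both layers
value_field = "WONUM"        # field being transferred

# Build the lookup dictionary from the source layer
lookup = {row[0]: row[1]
          for row in arcpy.da.SearchCursor(source, [key_field, value_field])}

# Apply it to the target layer, row by row
with arcpy.da.UpdateCursor(target, [key_field, value_field]) as cursor:
    for row in cursor:
        if row[0] in lookup:
            row[1] = lookup[row[0]]
            cursor.updateRow(row)
```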
I have some Python scripts that do some calculations on AGOL data and often run into the timeout issue.
I tried using some code that processed the data in 'chunks', but there seems to be no consistency in how many records it gets through before it fails.
So I modified my script to keep track of what has been updated, and then keep trying on the rest of the data until it is complete. Not sure if this will help, but here is a post where I give an example of how I'm performing this.
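A trimmed-down sketch of the idea, assuming the ArcGIS API for Python; the layer URL, field name, and the compute_value() helper are all placeholders:

```python
# Keep-retrying approach: track which records still need updating and
# re-attempt them in chunks until everything succeeds (or we give up).
from arcgis.gis import GIS
from arcgis.features import FeatureLayer

gis = GIS("home")  # or GIS(url, username, password)
layer = FeatureLayer("https://services.arcgis.com/.../FeatureServer/0", gis)

# Records still waiting to be updated, keyed by object ID
fset = layer.query(where="1=1", out_fields="OBJECTID,WONUM")
pending = {f.attributes["OBJECTID"]: f for f in fset.features}

chunk_size = 250
max_passes = 20  # give up eventually rather than looping forever

for _ in range(max_passes):
    if not pending:
        break
    chunk = list(pending.values())[:chunk_size]
    for feat in chunk:
        feat.attributes["WONUM"] = compute_value(feat)  # hypothetical helper
    try:
        result = layer.edit_features(updates=chunk)
    except Exception:
        continue  # timeout/server error: retry the remainder next pass
    # Drop only the records the server confirmed as updated
    for res in result.get("updateResults", []):
        if res.get("success"):
            pending.pop(res["objectId"], None)
```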
R_