
.geodatabase with large number of edits failing to sync

Question asked by mikedmanak on Feb 4, 2017
Latest reply on Feb 6, 2017 by mikedmanak

We have a transit efficiency app that occasionally needs to perform calculations on large numbers (200k - 1m) of records. In the majority of cases, syncing the .geodatabase after a large calculation like this fails silently, with no messages in the logs.

In these cases I'm forced to retrieve the uploaded diff database and manually add the data to our SQL Server-powered SDE database. Of course, physically joining and updating fields across 200,000 records is also proving to be quite slow and unreliable, to the point where I generally have to delete and replace the records entirely rather than update the fields in place.
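
For context, the manual fallback looks something like the arcpy sketch below. All paths, dataset names, and field names are placeholders, and it assumes the edits from the delta have first been copied into a staging feature class that arcpy can read; rather than a physical join, it keys the recalculated values in a dictionary and writes them back with an update cursor.

```python
# Rough sketch of the manual fallback. Every path, dataset name, and field name
# here is a placeholder; it assumes the edits from the sync delta have already
# been copied into a staging feature class that arcpy can read.
import arcpy

staging_fc = r"C:\data\sync_delta.gdb\calculated_edits"      # edits extracted from the diff
target_fc = r"C:\connections\transit.sde\transit.DBO.stops"  # enterprise feature class
sde_workspace = r"C:\connections\transit.sde"                # connection file for the edit session
key_field = "ASSET_ID"                                       # shared key present in both datasets
update_fields = ["EFFICIENCY_SCORE", "LAST_CALC_DATE"]       # fields recalculated by the app

# Load the recalculated values into a dictionary keyed on the shared ID,
# so no physical join against the 200k+ rows is needed.
new_values = {}
with arcpy.da.SearchCursor(staging_fc, [key_field] + update_fields) as cursor:
    for row in cursor:
        new_values[row[0]] = row[1:]

# Write the values back inside an edit session so the changes can be rolled
# back if something goes wrong partway through.
with arcpy.da.Editor(sde_workspace):
    with arcpy.da.UpdateCursor(target_fc, [key_field] + update_fields) as cursor:
        for row in cursor:
            values = new_values.get(row[0])
            if values is not None:
                cursor.updateRow([row[0]] + list(values))
```

This only covers attribute updates on existing rows; inserts and deletes coming out of the delta would still need to be handled separately.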

It seems there may be a practical limit to what can be done via the sync process. I'm curious whether anyone at ESRI can enlighten me on what the limiting factors might be (memory, bandwidth, processor, etc.) and whether any work has been done to identify these limitations.

Thanks
