
.geodatabase with large number of edits failing to sync

02-04-2017 12:05 PM
MichaelDavis3
Frequent Contributor

We have a transit efficiency app that occasionally needs to perform calculations across large numbers (200k - 1M) of records.  In most cases, syncing the geodatabase after a large calculation like this results in a silent failure, with no messages in the logs.

In these cases I'm forced to retrieve the uploaded delta geodatabase and manually add the data to our SQL Server-backed SDE database.  Of course, joining and updating fields across 200,000 records is also proving to be quite slow and unreliable, to the point where I generally have to delete and replace the records outright rather than update the fields in place.

It seems there may be a practical limit to what can be done via the sync process.  I'm curious whether anyone at ESRI can enlighten me on what the limiting factors might be (memory, bandwidth, processor, etc.) and whether any work has been done to identify these limits.
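For reference, the sync call itself is nothing exotic - roughly the pattern below, sketched from memory against the ArcGIS Runtime SDK for iOS 100.x API (the service URL is a placeholder and the names are approximate, so treat this as illustrative rather than exact):

```swift
import ArcGIS

// Placeholder URL for a sync-enabled feature service.
let serviceURL = URL(string: "https://example.com/arcgis/rest/services/Transit/FeatureServer")!
let syncTask = AGSGeodatabaseSyncTask(url: serviceURL)

func sync(_ geodatabase: AGSGeodatabase) {
    // Build sync parameters from the geodatabase's own sync metadata.
    syncTask.defaultSyncGeodatabaseParameters(with: geodatabase) { params, error in
        guard let params = params else {
            print("Could not build sync parameters: \(String(describing: error))")
            return
        }
        let job = syncTask.syncJob(with: params, geodatabase: geodatabase)
        job.start(statusHandler: { status in
            print("Sync status: \(status.rawValue)")
        }, completion: { results, error in
            if let error = error {
                print("Sync failed: \(error)")
                return
            }
            // A nil error does not mean every edit applied; the per-layer
            // results list any edits the server could not process.
            for layerResult in results ?? [] {
                print("Layer \(layerResult.layerID): \(layerResult.editResults.count) edit errors")
            }
        })
    }
}
```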

Thanks

4 Replies
DavidLednik
Frequent Contributor

Hi!

I have not observed this before, but just to clarify:

If you have a GDB with 300k points, does that mean your calculations will modify all of them? I guess what I'm asking is: what counts as a "large number of edits"?

How many attribute fields do you have in your GDB?

I can then try to reproduce it on my side.

regards,

David

MichaelDavis3
Frequent Contributor

Yes - in this case we have a transit efficiency study that uses background location tracking to collect track points every X seconds while surveyors ride various bus routes and document delays in service.  While the track data is being recorded, the app keeps a running tally of the current track's cumulative distance and writes it to a field each time it collects a point.
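The tally itself is conceptually just a running sum of point-to-point distances - something like the sketch below (simplified, not our production code; CLLocation's built-in distance stands in for whatever geodesic calculation you prefer, and saveTrackPoint is a hypothetical stand-in for writing the feature):

```swift
import CoreLocation

var cumulativeDistance = 0.0   // meters
var previousLocation: CLLocation?

// Called for each new GPS fix while a track is being recorded.
func recordTrackPoint(_ location: CLLocation) {
    if let previous = previousLocation {
        // Add the leg from the previous fix to the running total.
        cumulativeDistance += location.distance(from: previous)
    }
    previousLocation = location
    saveTrackPoint(location, cumulativeDistance: cumulativeDistance)
}

// Hypothetical stand-in: the real app creates a track point feature in the
// .geodatabase and sets its cumulative distance attribute here.
func saveTrackPoint(_ location: CLLocation, cumulativeDistance: Double) {
    print("point at \(location.coordinate), \(cumulativeDistance) m so far")
}
```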

Recently we've run into instances where the ObjectID is being written into the distance field instead of the distance value.  I haven't had a chance to diagnose why that is happening just yet, but I wrote up a quick routine to iterate through the database and recalculate the cumulative distance values for a survey, or for the entire dataset.  This is what is creating the 300k edits.
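The repair routine is the same idea run in batch - walk a survey's points in collection order and rewrite the field, which is exactly what turns a 300k-point dataset into 300k pending edits (sketch only; TrackPoint and the ordering are placeholders for the actual feature table access):

```swift
import CoreLocation

// Placeholder model - the real data lives in the geodatabase's feature table.
struct TrackPoint {
    let location: CLLocation
    var cumulativeDistance: Double
}

// Recalculates cumulative distance for points assumed sorted in collection
// order. Every point is rewritten, so every point becomes a pending edit.
func recalculateDistances(_ points: inout [TrackPoint]) {
    var runningTotal = 0.0
    for i in points.indices {
        if i > points.startIndex {
            runningTotal += points[i].location.distance(from: points[i - 1].location)
        }
        points[i].cumulativeDistance = runningTotal
    }
}
```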

I believe the track point feature class has ~15 fields.  The only field modified is the cumulative distance (double) field.

ChristopherMilack1
Regular Contributor

I haven't run into this exact issue, but I have hit a number of other issues with sync over the last several years. One recurring problem was that my app would freeze up while preprocessing the delta geodatabase, even with a small number of records. I've also found that the larger the geodatabase, the longer these operations seem to take (both pre- and post-processing of the sync).

I would recommend running the app under the Instruments Time Profiler to see if, and where, it is getting hung up. I would not be surprised if you have a really lengthy preprocessing step that simply never finishes.
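Even a crude timer around the pre- and post-processing steps can tell you a lot before you dig into the full trace - something like this minimal sketch (for the async sync job itself you'd stamp the start time before kicking off the job and log in its completion handler instead):

```swift
import Foundation

// Crude wall-clock bracketing to pair with a Time Profiler run: if one
// labelled step dominates, that's where to dig into the trace.
// e.g. let gdb = try timed("open geodatabase") { try openGeodatabase() }
func timed<T>(_ label: String, _ block: () throws -> T) rethrows -> T {
    let start = CFAbsoluteTimeGetCurrent()
    defer { print("\(label) took \(CFAbsoluteTimeGetCurrent() - start)s") }
    return try block()
}
```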

MichaelDavis3
Frequent Contributor

In this case it isn't the app - the delta geodatabase is successfully uploaded to ArcGIS Server - the server just silently stops at some point and never completes the data sync.

I'm not entirely clear on how sync is handled internally on ArcGIS Server, but for what it's worth, when I convert the database and join the records to move the edits across manually, field calculations don't seem to complete, so perhaps it's an issue with the database server.  We've already boosted RAM on that machine to deal with the archive maxing out memory, but perhaps something else is gumming up the works.
