SDE compress with locks

02-05-2014 09:23 AM
AndrewRudin
Occasional Contributor III
I am trying to come up with a compress workflow. I was hoping someone could clarify what happens to states when users or web map services are viewing the data while a compress is run.

Here is the scenario:  We have a primary SDE that all general users and all web services access for GIS data.  Then we have an editing SDE where GIS editors modify the data.  That data is then synced to the primary SDE nightly.  Currently the primary SDE is entirely unversioned; however, we now want to host a dataset with a geometric network.  To update this dataset the primary SDE will need to be versioned (we'd probably just edit DEFAULT directly), and once the sync is complete we'd need to run a compress.  However, it's my understanding that if there is a lock on a state in an SDE, the compress will skip the lineage for the state that's being viewed.  So I'm not sure we'd get an effective compress if lots of users and web maps are looking at DEFAULT.  I'd like to avoid kicking everyone off the server if I can, even at night, since the web services are accessed from outside our organization.  There really isn't any other editing on the server... only nightly Python/FME data pushes through non-versioned operations.
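For context, the nightly compress step I have in mind would look roughly like the sketch below (arcpy, with a made-up admin connection-file path). I mainly want to see who is connected and then run the compress anyway:

# Minimal sketch of the nightly compress step (connection path is a placeholder).
import arcpy

ADMIN_SDE = r"C:\connections\primary_sde_admin.sde"

# List current sessions first; locks held by these connections are what
# I'm worried will limit how far the compress can trim the lineage.
for user in arcpy.ListUsers(ADMIN_SDE):
    print("{0}  {1}  {2}".format(user.Name, user.ClientName, user.ConnectionTime))

# Compress still runs, but states referenced by open connections or versions
# cannot be removed from the state tree.
arcpy.Compress_management(ADMIN_SDE)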

Thanks for any help you can provide
2 Replies
WilliamCraft
MVP Regular Contributor
Upfront Comments: Please provide the exact versions of Esri and RDBMS products you are using. 

We have a primary SDE that all general users and all web services access for GIS data.  Then we have an editing SDE where GIS editors modify the data.  That data is then synced to the primary SDE nightly.


Questions: How is the data being synced?  Which specific Esri GP tools are you using with Python and FME?   

Currently the primary SDE is entirely unversioned; however, we now want to host a dataset with a geometric network.  To update this dataset the primary SDE will need to be versioned (we'd probably just edit DEFAULT directly), and once the sync is complete we'd need to run a compress.


Comment: You are correct that you would need to version your primary to receive replicated changes from your editing GDB.  This is the case whether or not you include a geometric network as part of the replica definition.  I will caution you, though, that depending on the version of ArcGIS you are using, you may experience a long wait during synchronization while your geometric network rebuilds on the child replica (i.e., your primary).  The network rebuild logic is not very efficient for client-side rebuilds during replication synchronization.  If you must use replication, just be prepared for this. 
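For reference, if you did go the replication route, the nightly sync itself is just a scheduled call along these lines (a sketch only; the replica name and connection paths below are placeholders):

# Hypothetical one-way sync from the editing (parent) GDB to the primary (child).
import arcpy

arcpy.SynchronizeChanges_management(
    r"C:\connections\editing_gdb.sde",    # geodatabase hosting the parent replica
    "GIS_Replica",                        # replica name (placeholder)
    r"C:\connections\primary_gdb.sde",    # geodatabase hosting the child replica
    "FROM_GEODATABASE1_TO_2",             # push edits parent -> child only
    "IN_FAVOR_OF_GDB1",                   # conflict resolution policy
    "BY_OBJECT",
    "DO_NOT_RECONCILE"
)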

Question: Are you only considering using versioning because you have a geometric network that you wish to exist in the primary?  I'm trying to understand the technical reason for why you are thinking about introducing replication. 

However, it's my understanding that if there is a lock on a state in an SDE, the compress will skip the lineage for the state that's being viewed.  So I'm not sure we'd get an effective compress if lots of users and web maps are looking at DEFAULT.  I'd like to avoid kicking everyone off the server if I can, even at night, since the web services are accessed from outside our organization.  There really isn't any other editing on the server... only nightly Python/FME data pushes through non-versioned operations.


Comment: Compression is a common workflow for anyone with a versioned geodatabase.  You will never achieve a full compress (to state 0) if there are connections that hold locks on particular states, as you allude to.  However, compressing regularly will help keep your lineage trimmed.  That being said, I think synchronizing from an enterprise GDB parent to an enterprise GDB child replica would put the edits into the delta tables on the child rather than make those edits directly to base.  You'll have to read through the Esri documentation to know this for sure, unless others who know the answer comment on this thread.  If the edits are put into the A and D tables, you'd need to compress the child replica too.  To actually move the edits from the deltas to the base tables, I think you would need everyone to disconnect in order to get to state 0.  That includes GIS services which access your GIS data from the primary.
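If you ever do need that full compress to state 0, the usual pattern is to block and drop connections around the compress, something like this (a rough sketch assuming arcpy 10.1 or later and an administrative connection file; the path is a placeholder):

# Sketch: full compress with everyone disconnected (admin connection path is a placeholder).
import arcpy

ADMIN_SDE = r"C:\connections\primary_sde_admin.sde"

arcpy.AcceptConnections(ADMIN_SDE, False)   # block new connections
arcpy.DisconnectUser(ADMIN_SDE, "ALL")      # drop existing sessions, including services
try:
    arcpy.Compress_management(ADMIN_SDE)    # with no locks, the lineage can trim to state 0
finally:
    arcpy.AcceptConnections(ADMIN_SDE, True)  # reopen the geodatabase either way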
AndrewRudin
Occasional Contributor III
SDE Enterprise 10.2 (Oracle 11gR2) and Desktop 10.2. I will use FME to edit the primary SDE.  We don't use replication since it has so much overhead: it requires versioning and GlobalIDs, and the schemas have to match between databases (yuck!).  Just give me a tool that reads a UNIQUE_ID column and a date field holding the last time each feature was updated in both databases, and figures out the inserts/updates/deletes/unchanged records.  Plus, FME has much better scheduling abilities.
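Conceptually the comparison is nothing more than this (a throwaway arcpy sketch; the feature class paths and field names are placeholders, and in practice FME does this part for me):

# Sketch of the ID/date change detection (paths and field names are placeholders).
import arcpy

def snapshot(fc):
    """Return {UNIQUE_ID: MODIFIED_DATE} for a feature class."""
    with arcpy.da.SearchCursor(fc, ["UNIQUE_ID", "MODIFIED_DATE"]) as cursor:
        return {uid: mod_date for uid, mod_date in cursor}

source = snapshot(r"C:\connections\editing_sde.sde\WATER\Mains")
target = snapshot(r"C:\connections\primary_sde.sde\WATER\Mains")

inserts = set(source) - set(target)
deletes = set(target) - set(source)
updates = {uid for uid in set(source) & set(target) if source[uid] != target[uid]}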

On the publishing SDE I want the geometric network available so our engineers can use it for tracing.  It needs to have network flow set to the digitized direction, and this is the one step that forces me to version the data, no matter how hard I try to avoid it.

Here's the workflow I was originally going for:
I planned for the dataset on the primary viewing SDE to stay unversioned for simplicity; our GIS administrators are very leery about having a versioned dataset on such a heavily used database.

1)  Editors on the editing SDE make edits in versions.  Custom tools populate a UNIQUE_ID field based on an Oracle sequence and update a MODIFIED_DATE field.
2)  Editor versions are posted to DEFAULT when they're complete.
3)  Each night an FME workspace runs, compares IDs/dates, and pushes the inserts/updates/deletes.  The FME SDE30 writer edits the feature classes directly, bypassing edit-session and versioning requirements.
4)  Once the FME script is done, all feature classes have the latest data.
5)  Recreate the geometric network with the ArcToolbox tool, since FME doesn't update the background tables that drive the geometric network.
6)  Version the dataset
7)  Run the Set Flow Direction toolbox tool to set flow to the digitized direction.
8)  Unversion the dataset and choose to have the edits in the DEFAULT version compressed to base.

This workflow would keep the dataset versioned for as little time as possible; however, steps 5, 6, and 8 will definitely require that no schema locks exist.
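In arcpy terms I pictured steps 6 through 8 as roughly the following (a sketch; the dataset and network names are made up, and the Unregister As Versioned options are my reading of that tool's parameters):

# Sketch of steps 6-8 (dataset and network names are placeholders).
import arcpy

dataset = r"C:\connections\primary_sde.sde\GIS.Water"
network = dataset + "\\GIS.Water_Net"

# 6) Version the dataset so the network tools can run against DEFAULT.
arcpy.RegisterAsVersioned_management(dataset, "NO_EDIT_TO_BASE")

# 7) Set flow to the digitized direction for tracing.
arcpy.SetFlowDirection_management(network, "WITH_DIGITIZED_DIRECTION")

# 8) Unversion and compress the DEFAULT edits down into the base tables.
arcpy.UnregisterAsVersioned_management(dataset, "KEEP_EDIT", "COMPRESS_DEFAULT")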

I saw that at 10.2.1 the Rebuild and Repair geometric network tools are in the toolbox now, so I guess I don't need to delete and recreate the network after all.

To avoid making the schema changes each night, I was thinking of instead leaving the dataset on the primary SDE permanently versioned.  Then I'd use the slower FME writer that writes to versioned SDEs using proper Esri edit sessions; it would write directly to the DEFAULT version.  Then I'd probably need to run the Repair and Set Flow Direction tools on the network, since I don't think FME is great at handling these while writing features.  The last big step is the compress, to make sure the edits make it into the base tables so the database stays nice and clean.  If there are users and web maps looking at the DEFAULT version when the compress happens at night, will it still compress the edits I just made to base?
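One way I could verify that afterwards is to check which state DEFAULT is sitting on once the compress finishes, along these lines (a sketch; assumes Oracle, an admin connection file, and that I have the SDE repository table names right):

# Sketch: check whether DEFAULT is back at state 0 after the compress
# (connection path is a placeholder; repository table names are for Oracle).
import arcpy

ADMIN_SDE = r"C:\connections\primary_sde_admin.sde"
sde_sql = arcpy.ArcSDESQLExecute(ADMIN_SDE)

state_id = sde_sql.execute("SELECT state_id FROM sde.versions WHERE name = 'DEFAULT'")
if int(state_id) == 0:
    print("Fully compressed - edits are in the base tables")
else:
    print("DEFAULT is still at state {0}; locks or versions held states open".format(int(state_id)))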