Versioned Data Runs Slow

01-22-2014 11:04 AM
New Contributor II
Using ArcGIS for Server 10.1.
On a monthly basis we replace/update a polygon Feature Class (< 60,000 records) located in an SDE Feature Dataset, using a Feature Class from outside our SDE (same schema).  We have Default and Edit versions of the geodatabase.

Here are the two procedures we've tried so far:
1) With the Edit version of the dataset, we tried deleting all of the records, then copying the features from the update and pasting them into the Edit version (we also tried copying first, then deleting the old records).  We've also tried deleting the features and appending the new ones (and appending first, then deleting the old).  Both approaches produce a VERY SLOW layer that takes a long time to load and makes edits impossible.

2) We tried importing the update into the dataset (right-click on the dataset -> Import Feature Class), which stays fast and easy to use, but it comes in unregistered as versioned (and uneditable, temporarily making all of the Feature Classes in the feature dataset uneditable).  When we register it as versioned, the speed slows WAY down.

We've also tried these Geodatabase Administration tools:
A)    Analyze the Feature Class
B)    Compress the Geodatabase
C)    Recalculate the Index

Note: All of our other versioned data is managed through Edit versions that we push up to Default, with no slowness issues.

Thanks for your assistance.

Accepted Solutions
New Contributor II
Thank you William and Leo for your time and help!  We finally got it to speed back up, and William, it did have to do with the spatial index!

Re-reading my posts, I didn't properly explain that the slowness was only occurring when viewing/editing/identifying/etc. the data in ArcMap.  The deleting, copying, and appending were not slow, so I hope that didn't throw you off too much.

Jack Horton at ESRI's Olympia office (thanks Jack!) helped walk us through many of the steps that you both had mentioned, including looking at the tables on the SQL side (no problems there), Geodatabase setup (nothing wrong), and then we looked at the spatial index of the feature class.

Both the spatial index and the extent had to be changed.  Here was our process:
- Right-click on the feature class
- In the Indexes tab, delete the spatial index
- Create/add the spatial index, changing the bounding box numbers (they were way off)
- In the Feature extent tab, change the XY extent (they were way off)
- Compress Database
- Rebuild Indexes
- Analyze
It is now fast again!

Thanks again!

5 Replies
MVP Regular Contributor
From your description, I'm not entirely clear if the performance issues you are experiencing are with the draw time of the feature class within ArcGIS or with the process of deleting and inserting of records.  Perhaps it is both workflows that are yielding slow behavior. 

Two quick questions come to mind right away:

Is the performance of your non-SDE source data the same, better, or worse than that of the SDE data you are describing? 

In thinking about the complexity of your polygon features, do they have millions of vertices or only a few per polygon?

You might see better performance with the following sequential steps:

1. TRUNCATE
When connected to your Edit version (which is assumed to be a child version of Default), use the Truncate GP tool to wipe out the records from your polygon feature class.  It sounds like you need to clean it out entirely, but correct me if I'm wrong here.  Truncate works much more efficiently than a delete operation, but doing this will immediately commit your transaction in the RDBMS without the ability to roll back. 

2. LOAD
You can use the Simple Data Loader tool in ArcCatalog or the Load Objects tool in ArcMap to insert the records from your non-SDE source into your target geodatabase.  Make sure you're connected to your Edit version when doing this. 

3. RECONCILE / POST
Since you loaded data into the Edit version, those changes need to be "moved" into the Default version using the reconcile and post tools within ArcGIS Desktop. 

4. COMPRESS
You mentioned a compress earlier, and you are correct in that thought process.  After reconcile and post, compress your geodatabase while connected to your Default version as the SDE user in order to move your adds and deletes to the base table. 

5. RDBMS MAINTENANCE TASKS
As you alluded to earlier, performing other RDBMS-related maintenance tasks can help increase performance.  These include analyzing tables and indexes as well as rebuilding indexes.  Many other kinds of database maintenance tasks that can improve performance are described online on a variety of sites. 
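On SQL Server, for example, the table/index analysis amounts to refreshing optimizer statistics; a minimal sketch (the table name dbo.PARCELS is a placeholder for your own feature class's base table):

```sql
-- Refresh optimizer statistics on a single table...
UPDATE STATISTICS dbo.PARCELS WITH FULLSCAN;
-- ...or across every table in the database.
EXEC sp_updatestats;
```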


On a related note, I personally do not recommend deleting a feature class and then importing data in order to create a new one... especially if the data you're working with is intended to be versioned.  If the data does not need to be versioned, then it is less of a challenge to do this.  (Permissions have been the main concern in that specific case, although this might be automatically handled in newer ArcGIS software if the data resides in a feature dataset.)  If the data needs to be versioned, then each time you delete the old feature class and add a new one via Import you will need to re-register it as versioned as you discussed already. 

It is possible that your performance issues are related to other problems and have nothing to do with the methods you're using to delete and insert new data.  Therefore, if the above workflow doesn't help you I would recommend taking a look at your database server's memory, disk I/O stats, and CPU.  It could be data-related.  Otherwise, it could be network-related.  Performance issues are tough to troubleshoot, especially via a forum post.  It really could be just about anything that is causing the slow behavior.
New Contributor II
From your description, I'm not entirely clear if the performance issues you are experiencing are with the draw time of the feature class within ArcGIS or with the process of deleting and inserting of records.  Perhaps it is both workflows that are yielding slow behavior. 
Two quick questions come to mind right away:
Is the performance of your non-SDE source data the same, better, or worse than that of the SDE data you are describing?

The data in non-SDE format is much faster.  Even when importing it into the SDE it's fast, but as soon as it's versioned it slows WAY down.  We've also exported the slow SDE version to a local file geodatabase, and it's fast again.
In thinking about the complexity of your polygon features, do they have millions of vertices or only a few per polygon?

Not very many vertices per polygon.  The Feature Class is about 20 MB, and there are fewer than 60,000 features.

1. TRUNCATE
When connected to your Edit version (which is assumed to be a child version of Default), use the Truncate GP tool to wipe out the records from your polygon feature class.  It sounds like you need to clean it out entirely, but correct me if I'm wrong here.  Truncate works much more efficiently than a delete operation, but doing this will immediately commit your transaction in the RDBMS without the ability to roll back.

Thanks for the idea; however, when I try "Truncate Table" in the Edit version (yes, it is a child version of Default), I get "ERROR 001400: Only the data owner may execute truncate".  If I try it on the Default version, I get "ERROR 001259: Operation not supported on a versioned table".

2. LOAD
You can use the Simple Data Loader tool in ArcCatalog or the Load Objects tool in ArcMap to insert the records from your non-SDE source into your target geodatabase.  Make sure you're connected to your Edit version when doing this. 

Thanks, this is a great tool (right-click on feature dataset -> Load -> Load Data)!  It works great, and I'll definitely use it from now on!

3. RECONCILE / POST
Since you loaded data into the Edit version, those changes need to be "moved" into the Default version using the reconcile and post tools within ArcGIS Desktop.

We reconcile/post all of our data on a regular basis.  Every layer except for this one performs great, even other feature classes within the same feature dataset.

Here's something else we just tried:
- With the Edit (child) version, deleted all of the features, saved edits
- Reconciled/posted
- With the Edit version, added just 15 features (14 KB)
- Reconciled/posted
This doesn't speed anything up at all.  It's still really slow.
MVP Regular Contributor
Two things I would try at this point to see if they resolve your issue:

1. Check the grid sizes on the spatial index of the feature class you are importing into SDE.  Are they all zeros, or have they been automatically set by ArcSDE?  Try recalculating the spatial index via ArcCatalog (you need to be connected as the data owner to do so).  The grid sizes may or may not change when you do this.  If they do change, try drawing the feature class again to validate performance.  If they do not change, manually set them to 1000,0,0 and re-check performance.  Other combinations to try are 1000,3000,0 and 1000,3000,9000.  If none of these options work, reset the spatial index back to what it was or let ArcGIS calculate it again. 
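If you'd rather inspect the current grid sizes directly in the SDE repository, a query along these lines can help (a sketch only — the gsize1 through gsize3 columns come from the Oracle LAYERS table and apply to SDE-managed spatial indexes; adjust the table and feature class names for your environment):

```sql
-- gsize1-3 hold the three spatial index grid levels for the layer.
SELECT owner, table_name, gsize1, gsize2, gsize3
FROM sde.layers
WHERE table_name = 'YOUR_FEATURE_CLASS';
```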

2.  Identify all of the attribute indexes for your base table (the polygon feature class table in SQL Server), the D table (deletes table), and the A table (adds table).  You can find out which A and D tables correspond to your base table by using a modified version of this query (written for Oracle, but it can be adapted for SQL Server with minor changes):

select
substr(l.TABLE_NAME,1,30) table_name,
substr('F'||layer_id,1,10) feature_table,
substr('S'||layer_id,1,10) spatial_table,
substr('A'||registration_id,1,10) adds_table,
substr('D'||registration_id,1,10) deletes_table,
substr(l.OWNER,1,12) owner
from     sde.layers l, sde.table_registry r
where    l.owner NOT IN ('SDE', 'SYS')
and      l.table_name = r.table_name (+)
and      l.owner      = r.owner (+)
order by owner, table_name
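For reference, a rough SQL Server adaptation of the same query might look like this (a sketch only — in a SQL Server geodatabase the repository tables are typically named sde.SDE_layers and sde.SDE_table_registry, but verify the names in your instance):

```sql
-- Sketch: map each registered layer to its F/S/A/D tables (SQL Server).
SELECT LEFT(l.table_name, 30)                      AS table_name,
       'F' + CAST(l.layer_id AS varchar(10))       AS feature_table,
       'S' + CAST(l.layer_id AS varchar(10))       AS spatial_table,
       'A' + CAST(r.registration_id AS varchar(10)) AS adds_table,
       'D' + CAST(r.registration_id AS varchar(10)) AS deletes_table,
       LEFT(l.owner, 12)                           AS owner
FROM sde.SDE_layers l
LEFT JOIN sde.SDE_table_registry r
       ON l.table_name = r.table_name
      AND l.owner = r.owner
WHERE l.owner NOT IN ('SDE', 'SYS')
ORDER BY owner, table_name;
```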

Once you identify the A and D tables, go into SQL Server Management Studio and manually rebuild the ATTRIBUTE indexes for the base, A, and D tables and then re-check performance.
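The manual rebuilds in SQL Server Management Studio might look something like this (the table names are placeholders — substitute your base table and the A&lt;id&gt;/D&lt;id&gt; tables you identified with the query above):

```sql
-- Rebuild the attribute indexes on the base table and its delta tables.
ALTER INDEX ALL ON dbo.PARCELS REBUILD;  -- base table (placeholder name)
ALTER INDEX ALL ON dbo.A117 REBUILD;     -- adds table (placeholder id)
ALTER INDEX ALL ON dbo.D117 REBUILD;     -- deletes table (placeholder id)
```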


Lastly, the truncate error you are seeing means that you need to connect to the geodatabase's Edit version as the data owner in order to perform the truncate.  You won't be able to do this on the Default version, nor would you want to.  Have a look at this article to read more about geodatabase owners:

http://resources.arcgis.com/en/help/main/10.2/index.html#//002q00000004000000
Regular Contributor
Eric,

You say you reconcile/post with the edit version.  And when you try to use the Truncate GP tool, you get the error that you are not the owner.

Are you sure you are successfully reconciling and posting your edits using the right user account?  Who owns the edit version?  Have you checked the adds and deletes tables for these layers?  Are they empty?

If you aren't reconciling/posting, all those edits will cause the slowness you have described. 

Also, what is the value of your TEMP variable in this table: SDE_server_config?
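To look that value up (assuming a SQL Server geodatabase, where the table is usually sde.SDE_server_config; column names may vary by release):

```sql
-- Read the TEMP property from the geodatabase server configuration table.
SELECT prop_name, char_prop_value, num_prop_value
FROM sde.SDE_server_config
WHERE prop_name = 'TEMP';
```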