Non-versioned data slowing way down

04-21-2014 06:21 AM
MattLane
Occasional Contributor II
I have two unversioned feature classes in a dataset to which our organization adds and deletes tens of thousands of records daily. One contains over 3 million points and the other over 500k lines. Each has an organizational ID field with an index and a GlobalID field with an index. We will be versioning and replicating these in the near future. They are getting slower and slower over time, to the point where deleting a couple of records can take 8 minutes and getting a feature count with the toolbox tool takes 3 minutes. Processing that used to take 1 to 2 hours now takes over 12.

Details:

  • SQL Server 2008 R2

  • SDE Release 101010

  • Desktop 10.1 SP1

  • Each spatial index was built with the Esri defaults (16 cells per object, Medium grid size on all levels)

  • Data is not in the SDE schema, but in a separate one

  • There isn't a lot of other data in this database, although there is a single versioned feature class in a different dataset that has topology rules applied to it.


What I've tried:

  • I rebuild and reorganize all indexes in this database as needed, following Microsoft's guidance

  • Checked the Bounding Boxes on the spatial indexes

  • Rebuilt statistics (Esri tools; a scripted version of this maintenance is sketched after this list)

  • Compressed

  • Shrunk database (our DBA does this periodically)

  • Restarted server (memory doesn't seem to be the issue)
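
For reference, the Esri-side maintenance can be scripted. This is only a rough sketch of what I run, assuming a 10.1 arcpy environment; the connection file path and feature class names below are placeholders, not my real ones:

    import arcpy

    # Placeholder connection file and feature class names -- substitute your own.
    conn = r"C:\connections\gisdb.sde"
    datasets = ["MYSCHEMA.Points", "MYSCHEMA.Lines"]

    # Rebuild the indexes on the two feature classes. NO_SYSTEM because the
    # data is owned by a non-sde schema and we only maintain our own tables;
    # ALL because the data is unversioned (no delta tables).
    arcpy.RebuildIndexes_management(conn, "NO_SYSTEM", datasets, "ALL")

    # Refresh the RDBMS statistics on the base tables (again, no delta or
    # archive tables to analyze on unversioned data).
    arcpy.AnalyzeDatasets_management(conn, "NO_SYSTEM", datasets,
                                     "ANALYZE_BASE", "NO_ANALYZE_DELTA",
                                     "NO_ANALYZE_ARCHIVE")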

16 Replies
MarcoBoeringa
MVP Regular Contributor
MattLane wrote: "The display of the data in ArcMap is actually really fast. But adding it to ArcMap takes minutes. Just selecting it in ArcCatalog freezes the application for minutes. Other layers in the database, even fairly large ones, are instantaneous. This all makes me think there is some cruft in the SDE tables, or at least something wrong in them."


Do you, by default, work in ArcCatalog with the Preview tab selected? Unfortunately, ArcCatalog always tries to draw the dataset's entire extent when you open the Preview tab, which can cause an apparent application hang. A full-extent draw defeats the spatial index and any spatial ordering of the rows in the underlying tables (spatial fragmentation), because the entire table has to be read to draw the dataset.

If so, you would do well to avoid the Preview tab as much as possible, and only draw the datasets in ArcMap with proper minimum and maximum display scales applied through the layer settings. It is also good practice to access your feature classes through *.lyr files stored on the file system, rather than directly through a database connection, because a layer file lets you pre-define and limit display scales and set symbology and appropriate labelling. A small script for stamping scale limits into a layer file is sketched below.
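
If you want to script the layer-file part, here is a minimal sketch, assuming a 10.x arcpy.mapping environment; the paths and the scale threshold are placeholders you would tune to your data:

    import arcpy

    # Placeholder paths -- substitute your own feature class and output .lyr.
    fc = r"C:\connections\gisdb.sde\MYSCHEMA.Points"
    lyr_path = r"C:\layers\Points.lyr"

    # Create a layer file that references the feature class.
    arcpy.MakeFeatureLayer_management(fc, "points_layer")
    arcpy.SaveToLayerFile_management("points_layer", lyr_path)

    # Stamp display-scale limits into the layer file so ArcMap never
    # attempts a full-extent draw of millions of features.
    lyr = arcpy.mapping.Layer(lyr_path)
    lyr.minScale = 50000   # hide the layer when zoomed out beyond 1:50,000
    lyr.maxScale = 0       # no zoomed-in limit
    lyr.save()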

It would also be wise to post some specs about the server hardware, LAN speed, and the number of users on the servers and hitting these feature classes at any given point in time.
MattLane
Occasional Contributor II
No, this isn't previewing the data; this is just clicking the name in the Contents tab. For example, when I right-click to select Properties, ArcCatalog freezes for 2 minutes before it even shows the context menu.

The software info is in the initial post; I don't have hardware or network info and will have to ask. I can tell you that the DB servers are virtualized with 16 GB of memory, and I have had the DBA restart the server to "free up memory," which didn't help in the slightest.

There is only one intermittent user during the day. I'll point out again that this has been a steady degradation over time: things that used to be instantaneous on the dataset now take up to 4 minutes.

I guess I'll contact support today.

Thanks everyone.
George_Thompson
Esri Frequent Contributor
Matt,

I wonder if the geoprocessing history for those two feature classes has grown very large. Every time you run a geoprocessing tool on a dataset, an entry is appended to its metadata, and that metadata is stored in a geodatabase system table.

You may want to look into this add-in: Delete Geoprocessing History from Metadata

Got the link from this post: http://forums.arcgis.com/threads/43399-metadata-geoprocessing-history

I would only do this after you have a verified backup of the geodatabase.

Try it on the two feature classes in question.

-George
--- George T.
WilliamCraft
MVP Regular Contributor
George, I see there are separate tools/scripts for deleting geoprocessing history metadata from 9.x and 10.x geodatabases.  Which ArcSDE system tables get cleaned out as a result of running these tools/scripts?  Is it METADATA, GDB_USERMETADATA, or both?
George_Thompson
Esri Frequent Contributor
Just the geoprocessing history part of the metadata gets deleted; that part can become rather large over time. The metadata lives in the Documentation field of the SDE.GDB_ITEMS table, which is an XML column. A quick way to measure its size is sketched below, after the links.

I just tested this on a local instance of SQL Server 2012, using ArcGIS Desktop 10.1 SP1 w/ QIP installed, and it deleted just the geoprocessing history for the feature classes I selected. I verified in the metadata for those feature classes that the geoprocessing section is now empty. Note that I only tested this on a 10.x geodatabase, not a 9.3.x one.

System Tables of a Geodatabase in SQL Server
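
If you want to measure how large the Documentation XML has grown before cleaning it, here is a rough sketch using arcpy's SQL pass-through. This assumes SQL Server and an sde-schema geodatabase (use dbo.GDB_ITEMS for a dbo-schema one), and the connection file path is a placeholder:

    import arcpy

    # Placeholder connection file -- substitute your own.
    conn = arcpy.ArcSDESQLExecute(r"C:\connections\gisdb.sde")

    # DATALENGTH returns the stored size of the XML value in bytes.
    sql = """
    SELECT Name, DATALENGTH(Documentation) AS doc_bytes
    FROM sde.GDB_ITEMS
    WHERE Documentation IS NOT NULL
    ORDER BY doc_bytes DESC
    """

    # execute() returns a list of rows for a multi-row result.
    for row in conn.execute(sql):
        print("{0}: {1} bytes".format(row[0], row[1]))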

Here is a link to a KB article that talks about how to do it via Python:
Automate the process of deleting geoprocessing history
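
The KB script boils down to something like the following sketch. I'm assuming Desktop 10.x here, which installs a "remove geoprocessing history" stylesheet under the Metadata\Stylesheets\gpTools folder; the feature class paths are placeholders:

    import os
    import arcpy

    # The stylesheet that strips geoprocessing history ships with Desktop.
    install_dir = arcpy.GetInstallInfo()["InstallDir"]
    xslt = os.path.join(install_dir, "Metadata", "Stylesheets", "gpTools",
                        "remove geoprocessing history.xslt")

    # Placeholder feature class paths -- substitute your own.
    fcs = [r"C:\connections\gisdb.sde\MYSCHEMA.Points",
           r"C:\connections\gisdb.sde\MYSCHEMA.Lines"]

    for fc in fcs:
        # Transform the item's metadata, dropping the geoprocessing history,
        # then import the cleaned XML back onto the feature class.
        out_xml = os.path.join(arcpy.env.scratchFolder,
                               os.path.basename(fc) + ".xml")
        arcpy.XSLTransform_conversion(fc, xslt, out_xml)
        arcpy.MetadataImporter_conversion(out_xml, fc)

Again, run this against a backup or a test copy first.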

I would recommend doing this only after you have a full, verified backup of the geodatabase, or trying it first on a test instance, in case there are any hiccups during the process.

Hopefully this helps clear up some questions.
--- George T.
WilliamCraft
MVP Regular Contributor
Thanks for clarifying, George. That is helpful. I'm curious to hear whether this helps Matt's performance issues as described above.
MattLane
Occasional Contributor II
The metadata's geoprocessing history proved to be the culprit. My processing now takes 1/8th of the time it was taking. The lengths of the Documentation values for the two feature classes in the GDB_ITEMS table went from 27 million and 19 million characters down to 1,500.

Thanks George, you're my new hero!