I'm working with a dataset that has over 7,000,000 (7 million) features. Using ArcGIS Desktop, we see a significant slowdown with as few as 2 users. CPU usage on the system is generally 30% or less.
For nightly maintenance, we run the following:
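It's essentially a compress-and-analyze pass. Roughly, it looks like the ArcPy sketch below; the connection file path and the analyze options shown are placeholders, not our exact settings:

```python
# Rough sketch of a nightly compress/analyze pass (paths are placeholders).
import arcpy

sde = r"C:\connections\gisprod.sde"  # placeholder admin connection file

# Compress the versioned geodatabase (reconcile/post happens earlier).
arcpy.Compress_management(sde)

# Refresh DBMS statistics so the optimizer has current row counts.
arcpy.env.workspace = sde
datasets = arcpy.ListFeatureClasses() + arcpy.ListDatasets("", "Feature")
arcpy.AnalyzeDatasets_management(
    sde,
    include_system="SYSTEM",
    in_datasets=datasets,
    analyze_base="ANALYZE_BASE",
    analyze_delta="ANALYZE_DELTA",
    analyze_archive="ANALYZE_ARCHIVE",
)
```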
At this point, we come to a compression state of 1.
I'm looking into other options to increase performance without adding additional hardware.
Thoughts?
Providing a bit more information would be helpful. What edition of SDE? Personal, Workgroup, Enterprise? What backend DBMS and what version? How much RAM and what is your disk arrangement like? Points? Polygons? How many fields besides the spatial column?
There are so many factors that may be impacting your results that, without more information, the best we can offer is speculation.
Keep in mind that we can't increase the resources. We don't have approval for this at this time.
Please include details of:

- What is being done with those 7 million points that causes the "slowdown"
- How those 7 million features are distributed across feature classes
- How many features are being drawn in any spatial envelope
- The geometry storage option being used (and if it differs by feature class, which feature classes use which storage)
- The sequencing of the features with respect to spatial fragmentation (that is, does a "draw all features" fill the viewfield from left to right or top to bottom, or does it draw randomly)
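As a rough sketch of how some of those numbers could be pulled (the connection file path is a placeholder, and this assumes an ArcPy environment is available):

```python
# Hypothetical sketch: per-feature-class shape type and row counts from the
# SDE workspace. The connection path is a placeholder.
import arcpy

arcpy.env.workspace = r"Database Connections\gisprod.sde"  # placeholder

for fc in arcpy.ListFeatureClasses():
    desc = arcpy.Describe(fc)
    count = int(arcpy.GetCount_management(fc).getOutput(0))
    print("{0}: {1}, {2} features".format(fc, desc.shapeType, count))
```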
- V
This happens when adding a point, deleting a point, adding a road, deleting a road. Y'know, the basics.
We have 1,000,000 features in the Roads layer and 6,000,000 features in the Points layer.
See Witch Magic, Snake Oil Medicine, and Spatial Index Tuning. If you're using the default spatial index that gets created when you make a new feature class through ArcCatalog, that is most likely the main cause of your bottleneck. This assumes you're on a version that defaults to the Geometry storage type.
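For example, rebuilding the spatial index with explicit grid sizes might look like the sketch below. The feature class path and grid values are placeholders; appropriate grid sizes depend on the data's extent and your typical map-window size, not these numbers.

```python
# Hypothetical sketch: drop and recreate the spatial index with explicit
# grid sizes instead of the defaults. Names and grid values are placeholders.
import arcpy

fc = r"Database Connections\gisprod.sde\GIS.Roads"  # placeholder path

arcpy.RemoveSpatialIndex_management(fc)
# Grid sizes should be derived from feature extents and the typical query
# window; the values here are only illustrative.
arcpy.AddSpatialIndex_management(fc, 500, 2000, 8000)
```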