
Terrible SDE Performance

03-10-2011 04:33 AM
ChrisMcKee
Emerging Contributor
Hello...

Just wondering if anyone has any insight into extremely poor performance after the following scenario:

Update a portion of a large dataset:

  1. Delete features

  2. Load in new features


Attempted solutions:

  • Recalculate spatial grid

  • Analyze tables (Business, Feature, Adds and Deletes)


After loading the new data, it literally takes a full minute to draw the entire dataset (not just the new parts).  Before the old data was deleted and the new data loaded, it would draw in a few seconds.

I really don't want to have to create a new feature class and swap them out; it's a bit of a pain, especially when I want to automate the entire process.  I just need to get past this awful performance.

We are using SDE 9.3.1 on SQL Server.

Forrest
9 Replies
VinceAngelo
Esri Esteemed Contributor
What tool(s) are you using to make the changes?

Is the table versioned?

What geometry storage option are you using?

How many rows are involved (total, deleted, & inserted)?

What is the envelope which is defined for the layer? What is the actual envelope
of features in the dataset?

"Draw entire dataset" should be a pure database operation, dependent solely
on the size of the database cache and the number of rows to return.

- V
ChrisMcKee
Emerging Contributor
What tool(s) are you using to make the changes?

  • My first test was manually deleting a portion of 'parcels' in an edit session

  • My second test was to empty the entire feature class using 'Delete Features' tool in tbx

  • To load in the new ones I literally did a 'load data' by right clicking the FC.  I will eventually be using the Append tool in the tbx.


Is the table versioned?

  • Yes

What geometry storage option are you using?

  • These are relatively simple, projected polygons, though some have a large number of vertices

How many rows are involved (total, deleted, & inserted)?

  • Total parcels 100,000+/-.

  • A high number of changes would be 50,000+, but I have seen the same performance when simply swapping out 100 +/-.


What is the envelope which is defined for the layer? What is the actual envelope
of features in the dataset?

  • As these could be part of a regional update or an entire provincial update, the envelope would be different every time.  For this example, think large- and small-scale envelopes.


Thanks,

Forrest
VinceAngelo
Esri Esteemed Contributor
Old questions, revisited:

> What geometry storage option are you using?

   SDEBINARY or GEOMETRY/GEOGRAPHY?

> What is the envelope which is defined for the layer? What is the actual envelope
> of features in the dataset?

   I'm looking for the actual layer envelope and the MBR of *all* features. The purpose here
is to determine whether a "draw all features" is doing a spatial query (it shouldn't, but an
incorrect layer envelope could throw off the ArcSDE optimizer).

New questions:

Since your table is versioned, how often do you reconcile and post features to the
base table?  What is the total row count in your A and D tables?

What is the average number of vertices per feature ('sdelayer -o stats' output)?

How is your coordinate reference defined ('sdelayer -o describe_long' output)?

- V
ChrisMcKee
Emerging Contributor
What geometry storage option are you using?
Sorry, I misinterpreted the question... we are using SDEBINARY.

What is the envelope which is defined for the layer? What is the actual envelope of features in the dataset?


  • This sounds promising but I am not sure how to check the actual envelope.  Are you referring to the spatial envelope?

        minx:  -1479597.40040,  miny:    505471.22000
        maxx:   2269595.76810,  maxy:   2067275.29180


Since your table is versioned, how often do you reconcile and post features to the
base table?  What is the total row count in your A and D tables?

Well, currently this is a new feature class, so it is only in testing.  However, between emptying and reloading I analyzed the tables, so the A & D tables should have been cleared; I also analyzed after the load, so they should not have been the issue.

What is the average number of vertices per feature ('sdelayer -o stats' output)?

  • Minimum Feature Number:                  1

  • Maximum Feature Number:             407635

  • Largest Feature:                      9565 Points

  • Smallest Feature:                        4 Points

  • Average Feature:                     19.87 Points

  • Minimum Polygon Area:                           0.00484 sq.

  • Maximum Polygon Area:                    71092824.10687 sq.

  • Average Polygon Area:                      111215.02587 sq.

  • Minimum Polygon Perimeter:                      0.44173

  • Maximum Polygon Perimeter:                  68832.04798

  • Average Polygon Perimeter:                    844.16128


How is your coordinate reference defined ('sdelayer -o describe_long' output)?


  • ArcSDE 9.3.1  for SQL Server Build 3056 Fri Sep 10 09:41:17  2010

  • Layer    Administration Utility

  • -----------------------------------------------------

  • Layer Description ....: <None>

  • Database              : CLIENT_NAME

  • Table Owner ..........: CLIENT_NAME

  • Table Name ...........: CLIENT_NAME_PARCELS

  • Spatial Column .......: SHAPE

  • Layer Id .............: 42

  • SRID .................: 1

  • Minimum Shape Id .....: 1

  • Offset ...............:

  •   falsex:  -16688100.000000

  •   falsey:   -9068200.000000

  • System Units .........:      10000.000000

  • Z Offset..............:          0.000000

  • Z Units ..............:          1.000000

  • Measure Offset .......: <None>

  • Measure Units ........: <None>

  • XY Cluster Tolerance .:          0.001

  • Spatial Index ........:

  •   parameter:    SPIDX_GRID,GRID0=940,GRID1=5640,FULL

  •   exist:        Yes

  •   array form:   940,5640,0

  • Layer Envelope .......:

  •   minx:  -1479597.40040,        miny:    505471.22000

  •   maxx:   2269595.76810,        maxy:   2067275.29180

  • Entities .............: nac+

  • Layer Type ...........: SDE

  • Creation Date ........: 02/09/11 16:15:12

  • I/O Mode .............: NORMAL

  • Autolocking ..........: Enabled

  • Precision.............: High

  • User Privileges ......: SELECT, UPDATE, INSERT, DELETE

  • Coordinate System ....: PROJCS["North_America_Albers_Equal_Area_Conic",GEOGCS["G

  • CS_North_American_1983",DATUM["D_North_American_1983",SPHEROID["GRS_1980",637813

  • 7.0,298.257222101]],PRIMEM["Greenwich",0.0],UNIT["Degree",0.0174532925199433]],P

  • ROJECTION["Albers"],PARAMETER["False_Easting",0.0],PARAMETER["False_Northing",0.

  • 0],PARAMETER["Central_Meridian",-105.0],PARAMETER["Standard_Parallel_1",20.0],PA

  • RAMETER["Standard_Parallel_2",60.0],PARAMETER["Latitude_Of_Origin",40.0],UNIT["M

  • eter",1.0]]


  • Layer Configuration ..: CLIENT_NAME_VECTOR


Thanks,


Forrest
VinceAngelo
Esri Esteemed Contributor
There are many ways to obtain the actual envelope, including zooming to the window and
writing down the LLx/y and URx/y, and using 'sdelayer -o alter -E calc'. Is your dataset really
10:1 in width:height? (2000km x 200km)

Your vertex count is relatively small, but you might see performance savings by using
an xyscale of 1000 (millimeter precision) or 100 (centimeter precision) vice the 0.1mm
default.
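A quick sketch of why the xyscale matters (plain Python for illustration only, not the ArcSDE API; the System Units value and falsex offset are taken from the describe_long output later in this thread). SDEBINARY stores coordinates as scaled integer offsets, so a coarser xyscale means smaller stored integers and more compact shape records:

```python
# Illustration only: how ArcSDE system units map to ground resolution
# and to the magnitude of the stored integer coordinates.

def ground_resolution(xyscale):
    """Smallest representable distance (meters) at a given xyscale."""
    return 1.0 / xyscale

print(ground_resolution(10000))  # 0.0001 m = 0.1 mm (the layer's current setting)
print(ground_resolution(1000))   # 0.001 m  = 1 mm
print(ground_resolution(100))    # 0.01 m   = 1 cm

# SDEBINARY stores (ground - offset) * xyscale as an integer, so a tenth
# of the xyscale means integers an order of magnitude smaller.
x = -1479597.4004        # layer minx, from the describe_long output
falsex = -16688100.0     # layer x offset (falsex), from the same output
print(round((x - falsex) * 10000))  # stored value at xyscale 10000
print(round((x - falsex) * 1000))   # stored value at xyscale 1000
```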

The minimum area indicates a *parcel* of 0.005 square meters (roughly 3"x3") -- that seems
questionable (not many lots are sold with the footprint of a fencepost -- how would you keep out
trespassers? 😉)
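For what it's worth, the 3"x3" estimate checks out (plain Python, the area value taken from the sdelayer stats above):

```python
import math

# Sanity check on the smallest polygon from the sdelayer stats:
# 0.00484 sq. meters is roughly a 7 cm x 7 cm square -- a sliver, not a lot.
min_area = 0.00484              # Minimum Polygon Area from the stats output
side_m = math.sqrt(min_area)    # side of an equivalent square, in meters
side_in = side_m / 0.0254       # meters -> inches
print(f"{side_m:.3f} m per side (about {side_in:.1f} inches)")
# -> 0.070 m per side (about 2.7 inches)
```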

- V
ChrisMcKee
Emerging Contributor

There are many ways to obtain the actual envelope, including zooming to the window and
writing down the LLx/y and URx/y, and using 'sdelayer -o alter -E calc'. Is your dataset really
10:1 in width:height? (2000km x 200km)

  • minx: -1,500,000m

  • maxx: 2,300,000m

  • miny: 500,000m

  • maxy: 2,070,000m

  • =3,800km(x) x 1,570km(y)



  • It's basically an amalgamation of all of our parcel datasets for a single continuous dataset.
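The exact figures behind the rounded arithmetic above can be checked with a few lines of plain Python (coordinates taken from the layer envelope posted earlier):

```python
# Recomputing the dataset extent from the quoted layer envelope
# (minx/miny/maxx/maxy in meters, Albers equal-area projection):
minx, miny = -1479597.4004, 505471.22
maxx, maxy = 2269595.7681, 2067275.2918

width_km = (maxx - minx) / 1000.0
height_km = (maxy - miny) / 1000.0
print(f"{width_km:.0f} km x {height_km:.0f} km")  # -> 3749 km x 1562 km

aspect = width_km / height_km
print(f"aspect ratio ~{aspect:.1f}:1")            # -> ~2.4:1, not 10:1
```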


Your vertex count is relatively small, but you might see performance savings by using
an xyscale of 1000 (millimeter precision) or 100 (centimeter precision) vice the 0.1mm
default.

  • To alter this precision, do I have to reload the data (create a new feature dataset) or can I change it once it's already in there?


The minimum area indicates a *parcel* of 0.005 square meters (roughly 3"x3") -- that seems
questionable (not many lots sold with the fooprint of a fencepost -- how would you keep out
trespassers? 😉

  • They are just really small fence posts or 'slivers'...lol

VinceAngelo
Esri Esteemed Contributor

My bad. I parsed the coords as minx, maxx on one line and miny, maxy on the next.



Coordinate references are fundamental properties, determined at layer creation; you'd
need to reload to change the precision.

- V

ChrisMcKee
Emerging Contributor

Do you see anything inherently wrong in how this is set up that would cause performance issues only after data is emptied and reloaded?



I'm just at a loss, because when I analyze the tables I would expect this to be resolved, but it isn't.

When I select the layer to 'preview', it takes a long time before it even begins to draw, let alone finish drawing.  Is there something wonky about the indexing or its ability to start querying the data?

VinceAngelo
Esri Esteemed Contributor

It's time to open an incident with Tech Support. Be sure to provide them the exact
versions of software (both Esri and database, to the service pack or patch level).
They'll need a copy of the dataset and the exact steps you're taking as well.

- V
