Domain Resolution and Tolerance - Anomalies

03-09-2021 10:46 AM
DanielSteuber
New Contributor II

Hey all,

So I've noticed increased sluggishness when drawing multiple feature classes from specific datasets in our SDE database on screen. If I export/import them into another dataset, they load quickly and efficiently. I'd like to avoid that as the overall solution, since I would have to go through and fix a lot of things that link back to them via the existing pathways.

The one common denominator I've found is that the Domain, Resolution and Tolerance tab shows some crazy figures on the slow-loading offenders. Is there ANY way to correct this without starting over with new datasets? Is that even a plausible cause of my sluggishness problem to begin with? They were not always this way, which leads me to hope that since they were changed, they can be changed again(?). We're currently in the middle of a billing system upgrade that touches our GIS, and it would not surprise me if something was messed with. I've already gone through and compressed the database; no dice.

That being said, please talk to me like I'm 8 years old, as I'm not a formally educated GIS guru and you're not going to hurt my ego, hahaha. See attached for some screenshots of good and bad dataset Domain, Resolution and Tolerance tabs. Thanks in advance if you can offer any advice on this!

8 Replies
DS_GIS
New Contributor III

Hello, 

Out of curiosity, have you been running database maintenance steps such as Rebuild Indexes, Analyze Datasets, and Compress?

I have had services become quite sluggish, and these steps will increase the efficiency of the database along with the services within it.
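
In case it's helpful, here is a rough arcpy sketch of those three steps, run as the data owner. The connection file path, dataset listing, and tool options are placeholders and assumptions, not your exact setup:

```python
# Rough arcpy sketch of the maintenance steps above, run as the data owner.
# The .sde connection file path is a placeholder for your own connection.
import arcpy

sde = r"C:\connections\gis_owner.sde"
arcpy.env.workspace = sde

# Collect what this connection owns: standalone feature classes, feature
# classes inside feature datasets, and tables.
fcs = arcpy.ListFeatureClasses() or []
for fd in arcpy.ListDatasets(feature_type="Feature") or []:
    fcs += arcpy.ListFeatureClasses(feature_dataset=fd) or []
owned = fcs + (arcpy.ListTables() or [])

# 1) Compress the versioned geodatabase.
arcpy.management.Compress(sde)

# 2) Rebuild the indexes on everything owned by this connection.
arcpy.management.RebuildIndexes(sde, "NO_SYSTEM", owned, "ALL")

# 3) Update statistics on the base, delta, and archive tables.
arcpy.management.AnalyzeDatasets(sde, "NO_SYSTEM", owned,
                                 "ANALYZE_BASE", "ANALYZE_DELTA", "ANALYZE_ARCHIVE")
```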

Sincerely,

Danielle

DanielSteuber
New Contributor II

Hey Danielle, thanks for the quick response. So I had analyzed the datasets and compressed the database, but I had not tried rebuilding the indexes. I'm going through now and will respond when I've hit them all. On some preliminary checks in SSMS, I am finding some significant fragmentation, so here's hoping that's all it is! I'll pop back in here when I'm done with the findings/results. Thanks!
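
For reference, this is roughly the kind of fragmentation check I mean, sketched with Python and pyodbc instead of running it straight in SSMS; the server, database, and driver names are placeholders:

```python
# Rough sketch of a SQL Server index fragmentation check (equivalent to what
# SSMS reports). Server, database, and driver names are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=mygisdb;Trusted_Connection=yes;"
)

sql = """
SELECT OBJECT_NAME(ips.object_id) AS table_name,
       i.name                     AS index_name,
       ips.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
  ON i.object_id = ips.object_id AND i.index_id = ips.index_id
WHERE ips.avg_fragmentation_in_percent > 30
ORDER BY ips.avg_fragmentation_in_percent DESC;
"""

# Print the most fragmented indexes first.
for table_name, index_name, frag in conn.cursor().execute(sql):
    print(f"{table_name}.{index_name}: {frag:.1f}% fragmented")
```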

DanielSteuber
New Contributor II

Update: Nope, no dice on the rebuild being the solution. Still the same sluggish response from those datasets. But hey, it needed to be done regardless, so that's one more thing checked off the list anyway!

DS_GIS
New Contributor III

I'm so sorry to hear that! I've had quite a few different services with sluggishness issues, and found that this was the solution. Out of curiosity, did you connect to the SDE as the system admin so you could include the system tables when you ran those steps?

Another thing I just considered, are any of the services in this SDE published to a server? Or are they only worked on within the SDE? Also, do you have versioning applied to this dataset?

DanielSteuber
New Contributor II

So I'm in as the admin, no issues there. All feature classes are registered as versioned.
You did hit on something I had in the back of my mind, though, and I did a little investigating. Yes, these feature classes are published as services on our separate GIS web server that connects to this main SQL Server. We do this for access to Collector/Field Maps on our iPads, but there really aren't that many people on it at once, maybe 6-8 maximum at a time, and I still get the slow responsiveness when I kick people off or disconnect the web server entirely from the network. Other features published on the web server outside of the datasets I mentioned previously seem to be fine as well, so I don't think that specifically is the issue. Very strange indeed... Thanks for the ideas!

DS_GIS
New Contributor III

I'm glad this gave you some good ideas! I suppose the way you are connecting to your server could add some latency, for example if you use a VPN. If all the servers are on a cloud platform, it should be fairly fluid. That being said, the added complexity of Collector can certainly impact this. We have a large ongoing field project that I run database maintenance for. I don't know if it's helpful, but this is my workflow (there's a rough arcpy sketch of steps 2-4 after the list):

1) Connect to the SDE as DBO > right-click to open 'Manage Replicas' > check, picking up from where I last left off, that all replicas have sent and received to the server.

2) Reconcile & post all versions > I resolve conflicts by attribute.

3) As DBO > Compress the SDE > Rebuild indexes > Analyze Datasets

4) As Sys Admin > Rebuild Indexes (include system tables) > Analyze Datasets (include system tables)
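
Here is a minimal arcpy sketch of steps 2-4, assuming two placeholder connection files (one as the data owner and one as the geodatabase admin) and a default version name that will depend on your setup:

```python
# Minimal arcpy sketch of steps 2-4 above. The connection files and the
# default version name are placeholders; adjust them for your geodatabase.
import arcpy

owner_sde = r"C:\connections\gis_owner.sde"   # connected as the data owner (DBO)
admin_sde = r"C:\connections\gis_admin.sde"   # connected as the geodatabase admin
default_version = "sde.DEFAULT"               # may be "dbo.DEFAULT" in a dbo-schema geodatabase

# 2) Reconcile & post every version into DEFAULT, with conflicts defined by
#    attribute. The conflict resolution choice here is the tool default.
edit_versions = [v for v in arcpy.ListVersions(owner_sde)
                 if v.upper() != default_version.upper()]
arcpy.management.ReconcileVersions(
    owner_sde, "ALL_VERSIONS", default_version, edit_versions,
    "LOCK_ACQUIRED", "NO_ABORT", "BY_ATTRIBUTE", "FAVOR_TARGET_VERSION",
    "POST", "KEEP_VERSION")

# 3) As the data owner: compress, then rebuild indexes and update statistics
#    on the datasets this connection owns.
arcpy.env.workspace = owner_sde
owned = (arcpy.ListFeatureClasses() or []) + (arcpy.ListTables() or [])
arcpy.management.Compress(owner_sde)
arcpy.management.RebuildIndexes(owner_sde, "NO_SYSTEM", owned, "ALL")
arcpy.management.AnalyzeDatasets(owner_sde, "NO_SYSTEM", owned,
                                 "ANALYZE_BASE", "ANALYZE_DELTA", "ANALYZE_ARCHIVE")

# 4) As the geodatabase admin: include the system tables.
arcpy.management.RebuildIndexes(admin_sde, "SYSTEM")
arcpy.management.AnalyzeDatasets(admin_sde, "SYSTEM")
```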

VinceAngelo
Esri Esteemed Contributor

I'm getting HTTPS errors on your file links. If you put the parameters in the question as text, then no one would need to download an image for the definition of "crazy."

Yes, spatial reference tolerances can have an impact on the compression algorithm used to store geometry, but only when using Esri's ST_GEOMETRY storage (you haven't specified the RDBMS or geometry storage type, either).

Yes, correcting the feature dataset spatial reference generally requires a complete reload of all the feature classes, and replacing a feature class spatial reference generally requires recreating and reloading the feature class.
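
For illustration only, a minimal arcpy sketch of that recreate-and-reload approach; the connection path, dataset names, coordinate system, and resolution/tolerance values are all placeholders:

```python
# Minimal sketch of recreating a feature dataset with a corrected spatial
# reference and reloading its feature classes. Every path, name, WKID, and
# tolerance value below is a placeholder, not a recommendation.
import arcpy

sde = r"C:\connections\gis_owner.sde"

# Build the spatial reference explicitly so the XY resolution and tolerance
# end up as sane values instead of the "crazy figures" in the bad dataset.
sr = arcpy.SpatialReference(2236)   # placeholder WKID; use your own coordinate system
sr.XYResolution = 0.0001            # in the spatial reference's linear units
sr.XYTolerance = 0.001

# Create a replacement feature dataset with the corrected spatial reference,
# then reload each feature class from the old (slow) dataset into it.
arcpy.management.CreateFeatureDataset(sde, "Water_Fixed", sr)

arcpy.env.workspace = sde
# "Water" stands in for the (possibly fully qualified) name of the old dataset.
for fc in arcpy.ListFeatureClasses(feature_dataset="Water"):
    out_name = fc.split(".")[-1]
    arcpy.management.CopyFeatures(fc, f"{sde}\\Water_Fixed\\{out_name}")
```

Treat the loop above as the bare-bones version; for feature classes with relationship classes, attachments, or topology, a copy/paste in the Catalog pane or the Feature Class To Geodatabase tool may preserve more behavior.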

- V

BasuMathapati
New Contributor II

I'm also looking for a solution to the issue you mentioned:

...the Domain, Resolution and Tolerance tab shows some crazy figures...

Did you find a better way to create a fresh dataset/feature class and set these figures right?

I've observed that the domain values are set automatically while creating a feature dataset or feature class. Is that efficient? Those domain values are far wider than the area of interest where I'll be creating my features.

The feature extent can be recalculated, but what about the values on the 'Domain, Resolution & Tolerance' tab?

Has anyone found a solution or come across this? Or should we simply ignore these domain values?
