Contour Feature Class Gets Huge When Copied to SDE

05-18-2020 05:20 PM
MattDeitemeyer
New Contributor III

I have a contour feature class that takes up 2 GB in a File GDB. When I move it to SDE, the disk space quadruples. This is a SQL Server instance of SDE. I've tried a number of different methods, and I have the same problem regardless of whether I use ArcGIS 10.5.1 or Pro. I've tried both SQL Server spatial and ESRI binary geometry types for the destination feature class. To move the data I've tried Load Features, Append, Copy Features, and an old-fashioned copy/paste. I keep getting the same result: a huge 'file' that takes forever to render. I can move the data between File GDBs with no problem. Any ideas?
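For reference, a minimal ArcPy sketch of the kind of load attempted (the paths and feature class names are placeholders, not the actual data):

```python
import arcpy

# Hypothetical paths; substitute the real FGDB and SDE connection file.
src = r"C:\data\terrain.gdb\contours_2ft"
sde = r"C:\connections\gis@sqlserver.sde"

# Straight copy into the enterprise geodatabase.
arcpy.management.CopyFeatures(src, sde + r"\contours_2ft")

# Alternatively, append into a pre-created (empty) SDE feature class
# so the storage keyword and geometry type are controlled up front.
# arcpy.management.Append(src, sde + r"\contours_2ft", "NO_TEST")
```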

14 Replies
MattDeitemeyer
New Contributor III

I have a FGDB version on the network and locally, and they both behave fine. The network version is not on the same server as SQL Server. I'll need to coordinate with our DB admin to watch the server while drawing the feature class. We are running SQL Server version 12 (SQL Server 2014). I should add that size on disk is a concern for the DB admin; it's a secondary issue for me. Rebuilding the spatial index didn't help. I have done a few things to the feature class to drive down size: tolerance and resolution were altered, and I ran an Identity (and a multipart-to-singlepart operation) to tile out the features so the spatial index could perform better. The contours are for visualization. Simplify failed with an 'out of memory' error. Generalize causes overlapping contours; the interval is 2 ft and some of the contours are less than a foot apart.
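For what it's worth, the spatial index rebuild was along these lines (the SDE path is a placeholder):

```python
import arcpy

fc = r"C:\connections\gis@sqlserver.sde\contours_2ft"  # hypothetical path

# Drop and re-create the spatial index. Explicit grid sizes only apply
# to file geodatabases and SDE binary storage; with SQL Server spatial
# types a default spatial index is built instead.
arcpy.management.RemoveSpatialIndex(fc)
arcpy.management.AddSpatialIndex(fc)
```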

JoshuaBixby
MVP Esteemed Contributor

Contour lines pose a challenge for spatial indexes because a single contour line tends to span very large distances, which reduces the efficacy of the index. I commonly take dense data sets like contour lines and break them up using a grid, e.g., the USGS 1:24,000 topographic map index grid. Splitting the contour lines allows for better spatial index efficacy and quicker draw times when zoomed in to small areas. For analysis, the way contours are selected might need to be modified, but results are not impacted once the selections are made correctly.
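A minimal ArcPy sketch of that workflow, assuming a polygon index grid feature class (all paths here are placeholders):

```python
import arcpy

contours = r"C:\data\terrain.gdb\contours_2ft"   # hypothetical inputs
grid = r"C:\data\terrain.gdb\usgs_24k_grid"      # polygon index grid
tiled = r"C:\data\terrain.gdb\contours_tiled"

# Split each long contour at the grid boundaries.
arcpy.analysis.Identity(contours, grid, tiled)

# Explode any multipart results so each piece gets its own,
# much smaller, spatial-index footprint.
arcpy.management.MultipartToSinglepart(tiled, tiled + "_single")
```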

SteveLynch
Esri Regular Contributor

To add to what Joshua says about splitting up the long contours...

We added an optional parameter to the Contour tool, viz., Maximum vertices per feature.
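In ArcPy (Spatial Analyst) that parameter looks like this in recent releases; the raster path, output path, and vertex cap below are placeholders:

```python
import arcpy
from arcpy.sa import Contour

arcpy.CheckOutExtension("Spatial")

# Generate 2 ft contours, capping each output feature at 50,000
# vertices so no single line spans the whole dataset.
Contour(
    in_raster=r"C:\data\dem_ft.tif",
    out_polyline_features=r"C:\data\terrain.gdb\contours_2ft",
    contour_interval=2,
    base_contour=0,
    z_factor=1,
    contour_type="CONTOUR",
    max_vertices_per_feature=50000,
)
```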

George_Thompson
Esri Frequent Contributor

My guess is that the SQL Server instance cannot handle features that large efficiently. If they are for visualization only, I would do the following.

- Keep an original copy of the contours in the GDB for reference if needed, or in a file GDB that is backed up for safekeeping.

- Create a generalized copy of the contours. You will need to determine the tolerance that is OK with you (see the sketch after these steps). Generalize—Help | Documentation

- Retest after the above step.
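A minimal sketch of the first two steps, assuming hypothetical paths and a 0.5 ft tolerance as a starting point:

```python
import arcpy

src = r"C:\data\terrain.gdb\contours_2ft"       # archived original
viz = r"C:\data\terrain.gdb\contours_2ft_viz"   # display copy

# Work on a copy so the source contours stay untouched.
arcpy.management.CopyFeatures(src, viz)

# Generalize edits in place, removing vertices within the tolerance;
# start small and retest draw performance before going coarser.
arcpy.edit.Generalize(viz, "0.5 Feet")
```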

As for your DBA being worried about disk space, that is a weird concern unless they have no more available on the machine.

Are you connecting to the SQL Server DB from ArcGIS Pro on your local machine or is the machine in your office?

If you are trying to pull all the data across a VPN or the internet, I could see some issues.

--- George T.
MarceloMarques
Esri Regular Contributor

Yes, it is expected and known that data will take more disk space when it is loaded into SDE. A few best practices can improve performance:

- Use SQL Server page compression: create the empty feature class schema, enable page compression on the tables and indexes, then load the data using the ArcCatalog Simple Data Loader (see the sketch below).
- Use a custom SDE DBTUNE to separate tables and indexes into different filegroups. Large feature classes like contours can have their own DBTUNE keyword with their own filegroups/datafiles, completely separate from the rest of the data.
- Tune, and possibly rebuild, the spatial index after you load the data to make sure you get the best drawing performance possible.
- Consider table partitioning and index partitioning; you can implement SQL Server partitioning to improve performance.
- When loading data into large feature classes, datafiles left in autogrow will cause lots of fragmentation. Consider running DBCC SHRINKDATABASE, or resize the datafiles before you load the data, to avoid heavy fragmentation.

All of these best practices help improve disk I/O, but at the end of the day your database storage plays a crucial role. You might find that even after implementing them the storage does not scale up and you need to consider faster storage for your database datafiles. You can determine this by running SQL Profiler traces while the data is drawing in ArcGIS to find the slow queries; analyzing the SQL query plan gives you more information about storage performance.

The edition of SQL Server also plays an important role in performance. If you are using SQL Server Express, you cannot expect it to scale up with very large feature classes, and advanced features like page compression require SQL Server Enterprise Edition. I hope this helps.
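For example, enabling page compression on the contours table and its indexes might look like the following, run through pyodbc; the server, database, table, and driver names are hypothetical, so check sys.indexes for the real object names first:

```python
import pyodbc

# Hypothetical connection string; adjust driver, server, and database.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=gisdb;DATABASE=sde;Trusted_Connection=yes;"
)
cur = conn.cursor()

# Rebuild the base table with page compression (hypothetical name).
cur.execute(
    "ALTER TABLE dbo.CONTOURS_2FT "
    "REBUILD WITH (DATA_COMPRESSION = PAGE);"
)

# Rebuild all indexes on the table with page compression as well.
cur.execute(
    "ALTER INDEX ALL ON dbo.CONTOURS_2FT "
    "REBUILD WITH (DATA_COMPRESSION = PAGE);"
)

conn.commit()
conn.close()
```

Ideally, enable compression on the empty schema before the bulk load, as described above, so the rows are compressed as they are written rather than rebuilt afterward.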

| Marcelo Marques | Principal Product Engineer | Esri |
| Cloud & Database Administrator | OCP - Oracle Certified Professional |
I have worked with Enterprise Geodatabases since 1997.
"I do not fear computers. I fear the lack of them." - Isaac Asimov