
Imported feature class size

10-30-2013 11:46 AM
DanielSmith
Frequent Contributor
Hello,

I have a contour feature class stored in a file geodatabase. When looking at the Contents tab of the geodatabase in ArcCatalog, I see that the size of the feature class is 4 GB.

I am attempting to store this data in an SDE database (SQL Server Express 2008 R2, so a 10 GB limit). When I import the data (whether using Feature Class To Feature Class, right-click > Import, or creating a blank feature class and loading into it), the .mdf file bloats to 10 GB and the import fails.

Why does this 4 GB feature class bloat and blow up my SDE database? Is there a way I can import this contour dataset into my SDE instance?

Any thoughts, suggestions, or guidance are greatly appreciated.

-Daniel
15 Replies
DanielSmith
Frequent Contributor
V,

Thank you, that worked pretty well; the feature class loaded and has a much smaller impact on the database size.

The contour data that we have was generated by our vendor from a LiDAR point cloud. I'm going to check on the accuracy of their collection to better justify the decreased resolution of the coordinate system. I am sure they are nowhere near the default accuracy, so reducing the storage resolution should just be a matter of striking a happy medium between collection precision, contour accuracy, and coordinate system resolution.
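For anyone curious what reducing the coordinate system resolution actually does: a geodatabase stores coordinates snapped to a grid defined by the XY resolution, so a coarser resolution simply quantizes each coordinate onto a coarser grid. A minimal sketch of that quantization in plain Python (the coordinate value and resolutions below are made up for illustration):

```python
def snap(value, resolution):
    """Quantize a coordinate to the storage grid defined by the resolution."""
    return round(value / resolution) * resolution

x = 563412.123456789          # a projected coordinate, metres (made up)
fine = snap(x, 0.0001)        # fine grid: keeps sub-millimetre detail
coarse = snap(x, 0.01)        # coarser grid: centimetre precision only
print(fine, coarse)           # 563412.1235 563412.12
```

The tradeoff Daniel describes is exactly this: pick a resolution coarse enough to shrink storage, but still finer than the real collection accuracy, and nothing meaningful is lost.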
MarcoBoeringa
MVP Alum
I have a contour feature class stored in a file geodatabase. When looking at the Contents tab of the geodatabase in ArcCatalog, I see that the size of the feature class is 4 GB.


The contour data that we have was generated by our vendor from a LiDAR point cloud. I'm going to check on the accuracy of their collection to better justify the decreased resolution of the coordinate system.


That is a rather huge vector feature class by any standard. You really have to wonder, especially if this contour data was auto-generated from a highly detailed raster DEM or LiDAR point cloud, whether there isn't a huge amount of bloat in this dataset in terms of excess polyline vertices. Usually, with auto-generated contours, a generalization step is in order and can often reduce storage requirements considerably while maintaining the basic quality of the dataset. There is often far too much detail, and far too many vertices, added to the contours in the contour generation process.

Generalization will probably also mean you don't have to reduce storage precision, while still achieving a huge reduction in dataset size.
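To illustrate why generalization removes so many vertices: the Point Remove option in ArcGIS's Simplify Line tool is based on the Douglas-Peucker algorithm, which drops every vertex that stays within a tolerance of the simplified line. A rough, self-contained sketch of the idea (not the Esri implementation):

```python
import math

def _dist_to_segment(p, a, b):
    """Perpendicular distance from point p to the segment a-b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def douglas_peucker(points, tolerance):
    """Drop vertices that stay within `tolerance` of the simplified line."""
    if len(points) < 3:
        return list(points)
    # Find the vertex farthest from the straight line between the endpoints.
    dmax, index = 0.0, 0
    for i in range(1, len(points) - 1):
        d = _dist_to_segment(points[i], points[0], points[-1])
        if d > dmax:
            dmax, index = d, i
    if dmax <= tolerance:
        return [points[0], points[-1]]  # everything in between is noise
    # Keep that vertex and recurse on both halves.
    left = douglas_peucker(points[: index + 1], tolerance)
    right = douglas_peucker(points[index:], tolerance)
    return left[:-1] + right

# A wiggly but essentially straight contour segment collapses to its endpoints:
wiggly = [(0, 0), (1, 0.001), (2, 0), (3, 0.001), (4, 0)]
print(douglas_peucker(wiggly, 0.01))  # [(0, 0), (4, 0)]
```

Auto-generated contours are full of segments like `wiggly`, which is why the vertex count (and with it the storage footprint) drops so dramatically after generalization.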

Hahaha, sorry. It would be Enterprise, licensed through ArcGIS for Server.


I really think you should be looking at getting some sort of "enterprise"-type license for your SQL Server as well. It is a waste of money to have an Enterprise license for ArcGIS for Server while using SQL Server Express as the back end.
DanielSmith
Frequent Contributor
That is a rather huge vector feature class by any standard. You really have to wonder, especially if this contour data was auto-generated from a highly detailed raster DEM or LiDAR point cloud, whether there isn't a huge amount of bloat in this dataset in terms of excess polyline vertices. Usually, with auto-generated contours, a generalization step is in order and can often reduce storage requirements considerably while maintaining the basic quality of the dataset. There is often far too much detail, and far too many vertices, added to the contours in the contour generation process.

Generalization will probably also mean you don't have to reduce storage precision, while still achieving a huge reduction in dataset size.


Yes, I agree. I had tested generalization using Bend Simplify and Point Remove, and the results were really great; Bend Simplify had what I considered to be better results: much faster and much smaller in the end.

Unfortunately, the powers that be did not want the data manipulated from what was delivered. I will have to bring this up with them again. Some accuracy is going to be lost (from the original delivery) if they want it in SDE, whether in the resolution of the coordinates or in the reduction of vertices.
MarcoBoeringa
MVP Alum
Unfortunately, the powers that be did not want the data manipulated from what was delivered. I will have to bring this up with them again. Some accuracy is going to be lost (from the original delivery) if they want it in SDE, whether in the resolution of the coordinates or in the reduction of vertices.


I can think of no reason not to generalize contours from a (LIDAR) DEM or point cloud. If you want the accuracy of the original data, start using the original data...

I see little value, in terms of analytics or scientific analysis, in height contours carrying such severe accuracy constraints that generalization would be out of the question.

Height contours are usually just cartographic (but maybe there's the culprit, and your customer is a cartographic agency afraid of quality loss compared to traditional "hand-drawn" photogrammetric contours...)

If the latter is the case, it may be of help to you to hear that here in the Netherlands, the major cartographic agency responsible for country-wide topographic maps (the "Kadaster") actually just managed a unique achievement that even got them an Esri "Special Achievement in GIS" award: implementation of a fully(!) automated cartographic generalization process based on a huge (400+ models) custom-built ArcGIS ModelBuilder / FME suite that creates 1:50.000 maps from 1:10.000 without human intervention.

This seems to be a world first and quite a colossal achievement, especially since the auto-generated 1:50.000 map is fully replacing the existing "manually generalized" production line for 1:50.000. No other cartographic agency seems to have achieved this yet, although many have research projects in this direction.

A Dutch article about this:
http://www.gdmc.nl/publications/2012/Automatische_generalisatie.pdf

And from an online publisher an English language article in "Cartography and Geographic Information Science":
http://www.tandfonline.com/doi/abs/10.1080/15230406.2013.824637#.UnK83-L27zs
DanielSmith
Frequent Contributor
I can think of no reason not to generalize contours from a (LIDAR) DEM or point cloud. If you want the accuracy of the original data, start using the original data...

I could not agree more! I continually seem to deal with folks' inclination to believe that GIS data, no matter the source, correctly and accurately reflects reality, rather than being just a representation.

I see little value, in terms of analytics or scientific analysis, in height contours carrying such severe accuracy constraints that generalization would be out of the question.

Height contours are usually just cartographic (but maybe there's the culprit, and your customer is a cartographic agency afraid of quality loss compared to traditional "hand-drawn" photogrammetric contours...)

This is exactly the point I have tried to make repeatedly, though nowhere near as elegantly. And no, the end users are not cartographers by any means; they are more like consultants.

If the latter is the case, it may be of help to you to hear that here in the Netherlands, the major cartographic agency responsible for country-wide topographic maps (the "Kadaster") actually just managed a unique achievement that even got them an Esri "Special Achievement in GIS" award: implementation of a fully(!) automated cartographic generalization process based on a huge (400+ models) custom-built ArcGIS ModelBuilder / FME suite that creates 1:50.000 maps from 1:10.000 without human intervention.

This seems to be a world first and quite a colossal achievement, especially since the auto-generated 1:50.000 map is fully replacing the existing "manually generalized" production line for 1:50.000. No other cartographic agency seems to have achieved this yet, although many have research projects in this direction.

A Dutch article about this:
http://www.gdmc.nl/publications/2012/Automatische_generalisatie.pdf

And from an online publisher an English language article in "Cartography and Geographic Information Science":
http://www.tandfonline.com/doi/abs/10.1080/15230406.2013.824637#.UnK83-L27zs


So you're pretty much my new hero for thorough replies now. If you were involved in the above, kudos to you! That is a pretty amazing achievement.

Cheers,

Update: for anyone reading this far, vangelo answered my question, but mboeringa2010 solved my problem! MANY THANKS TO BOTH OF THESE USERS.
MarcoBoeringa
MVP Alum
So you're pretty much my new hero for thorough replies now. If you were involved in the above, kudos to you! That is a pretty amazing achievement.


No, I wasn't involved in this particular project. I just discovered these articles myself (I wasn't aware of this ongoing project) and thought the result worth mentioning as quite a remarkable achievement.

I have in the past been involved in other major projects here in the Netherlands for the Ministry of Transport, including a country-wide, highly detailed (about 1:10.000 scale) traffic noise modelling program and database implemented using ArcGIS, geodatabase versioning, and a custom 3D traffic noise modelling core built by our project partner.