It is 2015 now, and shapefiles still have a 2 GB file size limit? Can anyone please help me get around this?
It's still 2GB. Shapefiles have fallen out of favor as a data storage model over the last several years, being replaced essentially by geodatabases, so I suspect there is nothing driving a move to increase its storage capacity - at least from ESRI's perspective.
Chris Donohue, GISP
Shapefiles are still widely in use and will continue to be until geodatabases, or their replacement, become open source. 2 GB is not too onerous a limit, and most people will never reach it in their normal work. What a gdb offers beyond a fancy shell over a folder structure is what you should really be considering. If you require topology and the like, then you should use one; otherwise, if you work with open-source software, you will need to maintain parallel data sets.
Unfortunately, no. Shapefiles have this limit due to the limitations of the dBase file format (which isn't ESRI's fault).
Your alternative would be a file geodatabase if you're looking for something non-server-related.
As Steve mentioned the shapefile limit is due to the dBase file format. Unless you need interoperability with non-ESRI software I would use a file geodatabase.
The file geodatabase was a new geodatabase type released in ArcGIS 9.2. Its goals are to do the following:
Provide a widely available, simple, and scalable geodatabase solution for all users.
Provide a portable geodatabase that works across operating systems.
Scale up to handle very large datasets.
Provide excellent performance and scalability, for example, to support individual datasets containing well over 300 million features and datasets that can scale beyond 500 GB per file with very fast performance.
Use an efficient data structure that is optimized for performance and storage. File geodatabases use about one-third of the feature geometry storage required by shapefiles and personal geodatabases. File geodatabases also allow users to compress vector data to a read-only format to reduce storage requirements even further.
Outperform shapefiles for operations involving attributes and scale the data size limits way beyond shapefile limits.
From: ArcGIS Desktop
File geodatabase size and name limits are as follows:
File geodatabase size: No limit
Table or feature class size: 1 TB (default), 4 GB or 256 TB with keyword
Number of feature classes and tables: 2,147,483,647
Number of fields in a feature class or table: 65,534
Number of rows in a feature class or table: 2,147,483,647
Geodatabase name length: Number of characters the operating system allows in a folder name
Feature class or table name length: 160 characters
Field name length: 64 characters
Text field width: 2,147,483,647
Although this is an old thread, I do want to say that I resent the artificial limits on shapefile (parts).
There are limits dictated by the formats which have to be respected, but they are much higher than 2 GB.
Let me explain.
A) 2 GB comes from using old-style fseek(), which takes a signed 32-bit offset.
It can be extended by using the 64-bit variants: fseeko()/fseeko64() on POSIX systems, and _fseeki64() on Windows.
These can be used in 32-bit programs as well as 64-bit ones.
B) .dbf files do not have a 2 GB limit either. The record count is again a 32-bit field, so the maximum is 4,294,967,295 records; I can live with that. The record length is a 16-bit field, so with wide records file sizes can potentially be really big.
C) .shp files have 32-bit record offsets with a resolution of one word (2 bytes): an offset of 1 means a byte offset (and potential size) of 2, and so on. The sequence of offsets and sizes is stored in the .shx file, which does not impose additional limits. The offset is needlessly interpreted as a signed integer (there are no negative offsets). Leaving that as it is, the shapefile range would be 4 GB, not 2 GB. And as you will guess, if you interpret the offset as an unsigned integer, the limit goes up to 8 GB. I can live with that as well.
Really, I've implemented this and it works perfectly. QGIS, for example, does not impose the same restrictive, not to say oppressive, limits, which makes ArcGIS the odd one out. Since shapefiles are still an important exchange format, I think Esri should revise its tech docs to allow their full potential.
Throttling things is never a good form of advertising.
So, if you got as far as this, thanks for allowing me a rant that I have wanted to make for several years now.
I clipped some contours from LiDAR that were stored in a GeoPackage (which made QGIS the obvious choice, given the lack of full support for that format in ArcGIS) and saved the temporary layer to a shapefile, hoping the reduced extent and fields would keep it under 2 GB, because I knew that was the limit — though I figured it wouldn't work. It didn't: the .shp alone was over 2 GB. But it drew just fine in QGIS. Surprised, I brought it into ArcMap, where it drew roughly 3/4 of the way before stopping with a drawing error. I then clipped some others, one over 3.4 GB total, and they all drew just fine in QGIS. In ArcGIS Pro there was no drawing error, and the table showed the same number of records as in QGIS, but Pro also stopped drawing roughly 3/4 of the way through. A search brought me to this thread. Thanks for the post, Jan.