
Memory limits of file geodatabase?

11-23-2010 10:59 AM
MattRosales
Deactivated User
Hi All,

I've been importing shapefiles into a single file geodatabase with a Python script. The geodatabase is now 13 GB, and even after compacting it I cannot view or otherwise work with it without hitting "out of memory" errors. My machine has 8 GB of RAM. Are file geodatabase operations limited by the amount of installed memory? I'm surprised they don't fall back to virtual memory (swap).
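
Roughly, the import loop looks like the following (the paths are placeholders and the script is simplified, not my exact code):

import os
import glob
import arcpy

shp_dir = r"C:\data\shapefiles"   # placeholder source folder
gdb = r"C:\data\imports.gdb"      # placeholder file geodatabase

for shp in glob.glob(os.path.join(shp_dir, "*.shp")):
    # Copy each shapefile into the geodatabase as a feature class
    name = os.path.splitext(os.path.basename(shp))[0]
    arcpy.CopyFeatures_management(shp, os.path.join(gdb, name))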

Some of the error messages:

Not enough storage is available to complete this operation.; (code 8007000E)
Out of memory
Failed to execute CompressFileGeodatabaseData

Thanks for the insights,
5 Replies
HelenYang
Emerging Contributor
Instead of trying to compress the whole gdb, you could try compressing an individual feature dataset or feature class.
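
For example, something like this (the geodatabase path and feature class names below are placeholders; substitute the ones in your gdb):

import os
import arcpy

gdb = r"C:\data\imports.gdb"       # placeholder path

# Compress one feature class at a time instead of the whole gdb
for name in ["parcels", "roads"]:  # placeholder feature class names
    arcpy.CompressFileGeodatabaseData_management(os.path.join(gdb, name))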
MattRosales
Deactivated User
Thanks for the input. Unfortunately, I can't even list the contents of the geodatabase without incurring an out-of-memory error. The strangest part is that arcpy.ListFeatureClasses() returns null, yet a feature class can still be manipulated if I already know its name. Without a listing, though, I don't know the individual names of the feature classes to compress. arcpy.ListFiles() has also crashed the last two times I attempted it. Ugh!
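
Since everything was imported from shapefiles whose names I do know, one workaround I may try is probing the geodatabase for each expected name with arcpy.Exists instead of listing it (paths below are placeholders):

import os
import glob
import arcpy

shp_dir = r"C:\data\shapefiles"   # placeholder: the original source folder
gdb = r"C:\data\imports.gdb"      # placeholder path

# Derive candidate feature class names from the source shapefiles
for shp in glob.glob(os.path.join(shp_dir, "*.shp")):
    name = os.path.splitext(os.path.basename(shp))[0]
    if arcpy.Exists(os.path.join(gdb, name)):
        print(name)   # a feature class that survived the import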
VinceAngelo
Esri Esteemed Contributor
It sounds like your FGDB was corrupted, and the compress made things worse.

- V
MattRosales
Deactivated User
Is there any way to repair an FGDB, or at least get some diagnostics out of it to know for sure?
Thanks,
VinceAngelo
Esri Esteemed Contributor
Contact Tech Support; they're the only folks who might have tools like that.

- V