Hi All,
I've been importing shapefiles into a single file geodatabase with a Python script. The geodatabase is now 13 GB, and even after compacting it I cannot view it or work with it in any way — every attempt fails with "out of memory" errors. My computer has 8 GB of RAM. Are geodatabase sizes limited by the amount of installed memory? I'm surprised the tools don't fall back on virtual memory (swap).
Examples of the error messages I'm seeing:
Not enough storage is available to complete this operation.; (code 8007000E)
Out of memory
Failed to execute CompressFileGeodatabaseData
Thanks for the insights,