Exporting large geodatabase tables produces corrupt files?

01-27-2014 10:07 AM
RobertTrotter
New Contributor III
I am attempting to export the contents of a geodatabase table with four attributes/columns and roughly 225 million records.  Unfortunately, in addition to being very slow (~8 hours), the resulting output file is incomplete/corrupted.  I can read records in the file up to a point, beyond which the records are empty (though the file is still structured as if the total number of records were present).  I have tried exporting the table in smaller pieces, but with the same result.  The location in the file at which the records end varies with each export (the maximum so far is ~64 million records, the minimum ~28 million).  Does anyone know if:

1) Is there an inherent limit on the number of records that can be exported?  (Note: they are being exported to a text file and are not being loaded into memory.)

2) Are there alternate ways of exporting tables of this size?

Thanks!
Talbot
5 Replies
WilliamCraft
MVP Regular Contributor
Do you have a 32-bit OS or a 64-bit OS?  I seem to recall a file size limit where corruption eventually occurs on a 32-bit system.  How large is the final output once the process completes?
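A quick way to answer the 32-bit vs 64-bit question from within the Python session doing the export (note that a 32-bit process can hit 2 GB file/offset limits even on a 64-bit OS) is to check the process pointer size. This is a generic stdlib sketch, not something from the thread:

```python
# Check whether the running Python process (not just the OS) is 32- or
# 64-bit, since the export process's bitness is what matters here.
import struct
import platform
import sys

bits = struct.calcsize("P") * 8  # pointer size in bits: 32 or 64
print(f"Python process: {bits}-bit")
print(f"Machine architecture: {platform.machine()}")
print(f"sys.maxsize: {sys.maxsize}")
```

ArcGIS 10.x for Desktop ships a 32-bit Python by default, so the process can be 32-bit even on a 64-bit Windows install.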
VinceAngelo
Esri Esteemed Contributor
Eight hours isn't a particularly long time to complete the export of a quarter-billion rows.

We would need details on how you are going about the export to provide advice on other possibilities.  Be sure to include the exact version of ArcGIS, the exact RDBMS source, and the code you are using for export.

- V
NimaNattagh
New Contributor

Robert, did you get the issue resolved?

RobertTrotter
New Contributor III

Hi all, I did not find a resolution to the corruption problem, but I changed to exporting the data in smaller sets.  This seems to have sped the process up and avoids the overflow problem (assuming that is what caused it).  Thanks!
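For anyone trying the same workaround, here is a minimal sketch of the batching pattern in plain Python. The record source is a generic iterable; in ArcGIS the same loop would wrap an `arcpy.da.SearchCursor`. The function name and batch size are illustrative, not from the thread:

```python
# Sketch of the chunked-export approach: write records in fixed-size
# batches and flush after each batch, so rows never pile up in memory
# and a failure loses at most one batch rather than silently corrupting
# the tail of the file.
import csv

def export_in_batches(records, out_path, fieldnames, batch_size=1_000_000):
    """Write an iterable of row tuples to CSV, flushing every batch_size rows."""
    written = 0
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(fieldnames)
        batch = []
        for row in records:
            batch.append(row)
            if len(batch) >= batch_size:
                writer.writerows(batch)
                f.flush()  # push the completed batch to disk before continuing
                written += len(batch)
                batch = []
        if batch:  # final partial batch
            writer.writerows(batch)
            written += len(batch)
    return written
```

Splitting a 225-million-row export into, say, one file per few million rows also makes it easy to verify each piece's record count before concatenating.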

RobertTrotter
New Contributor III

As an update (this is an old thread, but in case it's ever useful): breaking the process up into subsets did seem to resolve the issue, so I am assuming at this point that the problem was in the architecture of the operating system (i.e., exceeding memory or the file-size limits of a 32-bit process).
