
Exporting large geodatabase tables produces corrupt files?

Question asked by rttrotter on Jan 27, 2014
Latest reply on Jan 27, 2014 by vangelo-esristaff
I am attempting to export the contents of a geodatabase table that has four attributes/columns and roughly 225 million records.  Unfortunately, in addition to being very slow (~8 hours), the resulting output file is incomplete/corrupted.  I can read records in the file up to a point, beyond which the records are empty (though the file is still structured as if the total number of records were present).  I have tried exporting the table in smaller pieces, but with the same result.  The location in the file at which the records end varies with each export (the maximum so far has been ~64 million records, the minimum ~28 million).  Does anyone know if:

1) There is an inherent limit on the number of records that can be exported (note that they are being written to a text file, not loaded into memory).

2) Are there alternate ways of exporting tables of this size? (The chunked/streaming approach I have been trying is sketched below.)
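For reference, here is a minimal sketch of the kind of streaming export I have in mind, assuming ArcGIS 10.1+ (arcpy.da) and Python 2; the table path, output path, and field names are placeholders, not my actual data:

import arcpy
import csv

# Hypothetical paths and field names for illustration only
table = r"C:\data\mydata.gdb\big_table"
out_csv = r"C:\data\export\big_table.csv"
fields = ["FIELD_A", "FIELD_B", "FIELD_C", "FIELD_D"]

# Stream rows straight to disk with arcpy.da.SearchCursor so that
# nothing accumulates in memory; flush periodically to track progress.
with open(out_csv, "wb") as f:  # binary mode for the csv module under Python 2
    writer = csv.writer(f)
    writer.writerow(fields)
    count = 0
    with arcpy.da.SearchCursor(table, fields) as cursor:
        for row in cursor:
            writer.writerow(row)
            count += 1
            if count % 1000000 == 0:
                f.flush()
                print("{0} rows written".format(count))
    print("Done: {0} rows".format(count))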

Thanks!
Talbot
