
Batch combining file geodatabases

08-21-2013 07:32 AM
RobertBorchert
Honored Contributor
I created a script to FTP some 3,000 geodatabases, and I unzipped them in batch from a DOS prompt.

Each file geodatabase contains a tile with contours, DEMs, buildings, etc., built from LIDAR data.

What I am looking for is a method to combine the contours from the multiple file geodatabases in bulk, without resorting to running the data loader on each contour set.
5 Replies
VinceAngelo
Esri Esteemed Contributor
What are you combining these file geodatabases into?  Another FGDB?  How large
do you expect the resulting table to be (in rows and GB)?  Are you expecting to join
the features at tile boundaries?  Is there a naming system which will allow you to
know the tile location without polling the features?

Python can of course be used to append feature classes, but with the volume involved,
you'll probably need a mechanism that will permit interruption and resumption of the
load cascade process.

Looks like a TANSTAAFL situation to me.

- V
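[Editorial sketch, not from the original thread] As a rough illustration of the kind of ArcPy loop being suggested here, the following is a minimal sketch that appends a "Contours" feature class from each tile geodatabase into one target and writes a simple log so an interrupted run can be resumed. The folder paths, the feature-class name "Contours", and the log file are all assumptions, not details taken from the thread.

import os
import arcpy

# All paths below are placeholders -- point them at your own data.
SOURCE_DIR = r"C:\lidar\tiles"                    # folder of unzipped tile geodatabases (assumed)
TARGET_FC  = r"C:\lidar\merged.gdb\Contours"      # pre-created target with the same schema (assumed)
DONE_LOG   = r"C:\lidar\loaded_tiles.txt"         # one gdb path per line; lets a crashed run resume

done = set()
if os.path.exists(DONE_LOG):
    with open(DONE_LOG) as f:
        done = set(line.strip() for line in f)

arcpy.env.workspace = SOURCE_DIR
for gdb in arcpy.ListWorkspaces("*", "FileGDB"):
    if gdb in done:
        continue                                  # already loaded in an earlier run
    src = os.path.join(gdb, "Contours")
    if not arcpy.Exists(src):
        continue                                  # this tile has no contour layer
    arcpy.Append_management(src, TARGET_FC, "NO_TEST")
    with open(DONE_LOG, "a") as f:                # record success only after the append finishes
        f.write(gdb + "\n")
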
RobertBorchert
Honored Contributor
Thanks for the Heinlein quote.

The tiles each have their own unique name.  Each file geodatabase has a common model structure.

Unfortunately I am not Python savvy.

I would like to load them into either a file or personal geodatabase, which may eventually end up in SDE.

My recourse in the past for combining multiple sources into one has been the simple copy/paste.

VinceAngelo
Esri Esteemed Contributor
Which would you rather learn, Python, C#, or C++?  Your only batch options
are ArcPy, ArcObjects, and FGDB API (the latter two would require hundreds
to thousands of lines of code, while Python would only require dozens).
It would only take a few hours to learn enough ArcPy to start this.

Don't even bother looking at a PGDB (size and compatibility).  I would
strongly discourage exceeding 20 GB in a single FGDB table (too many
eggs in one basket).  If you're targeting ArcSDE, then don't bother merging
FGDBs; do try to load either across or up/down tiles, to reduce spatial
fragmentation.

I started writing a standalone tool to transfer FGDB tables to ArcSDE, but
Microsoft library compatibility issues (static v. dynamic) made for more
work than I was willing to take on.  My fallback plan was to export FGDB
to ASCII, which could then be quickly bulk-loaded into ArcSDE with 'asc2sde',
but I haven't had time to mess with that either.

- V
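[Editorial sketch, not from the original thread] One hedged reading of the "load across or up/down tiles" advice: if the tile name encodes a row/column position (that naming scheme is an assumption here, not something stated in the thread), sort the geodatabases by that key before appending, so features that are neighbors on the ground also land near each other in the target table.

import os
import re
import arcpy

SOURCE_DIR = r"C:\lidar\tiles"                    # assumed folder of unzipped tile geodatabases
TARGET_FC  = r"C:\lidar\merged.gdb\Contours"      # assumed pre-created target feature class

def tile_key(gdb_path):
    # Assumed naming pattern like "tile_R012_C034.gdb"; adapt the regex to your own scheme.
    m = re.search(r"R(\d+)_C(\d+)", os.path.basename(gdb_path))
    return (int(m.group(1)), int(m.group(2))) if m else (0, 0)

arcpy.env.workspace = SOURCE_DIR
for gdb in sorted(arcpy.ListWorkspaces("*", "FileGDB"), key=tile_key):   # row-major load order
    src = os.path.join(gdb, "Contours")
    if arcpy.Exists(src):
        arcpy.Append_management(src, TARGET_FC, "NO_TEST")
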
RobertBorchert
Honored Contributor
I went with the low-budget, "you just need a computer with grit" method.

From a DOS prompt I unzipped all 2,573 file geodatabases.

They are large because of the DEMs they contain.  I then created a new .gdb.

I simply selected all the .gdbs in ArcCatalog, right-clicked, and did an Export > To Geodatabase (multiple).

This only exported the points, lines, and polygons.  The resulting .gdb was surprisingly small: a test on 100 .gdbs came out to only 70 MB before I deleted the other features I didn't need anymore.

I am doing it 500 at a time.

No code, no dozens of hours trying to write something fancy.  Just simple, creative use of the existing out-of-the-box tools.
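[Editorial sketch, not from the original thread] For anyone who does want to script that same out-of-the-box step, the ArcCatalog "Export > To Geodatabase (multiple)" command corresponds to the Feature Class To Geodatabase geoprocessing tool, and a rough ArcPy loop might look like the sketch below. The paths are placeholders; because every tile uses the same feature-class names, verify on a small batch first (as was done above with 100 geodatabases) how the tool resolves the duplicate output names in your environment.

import os
import arcpy

SOURCE_DIR = r"C:\lidar\tiles"                    # assumed folder of unzipped tile geodatabases
OUTPUT_GDB = r"C:\lidar\combined.gdb"             # assumed existing, empty file geodatabase

arcpy.env.workspace = SOURCE_DIR
for gdb in arcpy.ListWorkspaces("*", "FileGDB"):
    arcpy.env.workspace = gdb                     # list the contents of this tile
    fcs = arcpy.ListFeatureClasses()              # vector feature classes only; rasters/DEMs are skipped
    if fcs:
        full_paths = [os.path.join(gdb, fc) for fc in fcs]
        arcpy.FeatureClassToGeodatabase_conversion(full_paths, OUTPUT_GDB)
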




RobertBorchert
Honored Contributor
As a side note, I was pretty good at writing code in MapBasic back in my MapInfo days.

I am going to start looking at learning Python.  It just seems there are too many times when it would be handy.
