Migrate from FGDB to SDE

03-13-2023 12:39 AM
Status: Closed
by MarGIS (New Contributor III)
I need to migrate a huge database containing more than 6 million records, with tens of feature datasets and more than 800 feature classes. I wish there were a tool that could migrate data from FGDB to SDE other than XML import, which Esri does not recommend, especially for large databases.

6 Comments
George_Thompson

I would say the safest is copy/paste. You could use the Export Features GP tool to batch this process from the FGDB to the EGDB.
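A minimal ArcPy sketch of that batch approach (the paths are placeholders, and `arcpy` requires an ArcGIS Pro Python environment, so it is imported only inside the function that needs it):

```python
import os

def fc_paths(workspace, datasets, fcs_by_dataset):
    """Build full paths for feature classes, grouped by feature dataset.
    Pure-Python helper so the batching logic is testable without ArcGIS."""
    paths = []
    for ds in datasets:
        for fc in fcs_by_dataset.get(ds, []):
            # A standalone feature class ("" dataset) lives directly
            # under the workspace; others live inside a feature dataset.
            paths.append(os.path.join(workspace, ds, fc) if ds
                         else os.path.join(workspace, fc))
    return paths

def batch_copy(source_gdb, target_sde):
    """Copy every feature class from the file geodatabase to the EGDB."""
    import arcpy  # deferred: only available in an ArcGIS Python environment
    arcpy.env.workspace = source_gdb
    datasets = [""] + (arcpy.ListDatasets(feature_type="feature") or [])
    fcs = {ds: (arcpy.ListFeatureClasses(feature_dataset=ds or None) or [])
           for ds in datasets}
    for path in fc_paths(source_gdb, datasets, fcs):
        # Note: conversion tools copy data, not geodatabase behavior
        # (relationship classes, topologies).
        arcpy.conversion.FeatureClassToGeodatabase(path, target_sde)
```

This is only a sketch of the batching idea, not a full migration script; error handling and logging would be needed for 800 feature classes.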

MarceloMarques

@MarGIS 

I have been using the approach below for over 25 years, since the geodatabase was first released back in 1998. It is my preferred method when the datasets are quite large.

1. Export the XML schema only, with no data.

2. Import the empty XML schema into the Enterprise Geodatabase (ArcSDE).

3. Use the Append geoprocessing tool to load the large feature classes.

4. Consider using ModelBuilder, ArcPy scripts, the ArcGIS Data Interoperability extension, or FME Workbench to automate the data load if you have a very large schema with a large number of feature classes.
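Steps 1-3 above can be sketched in ArcPy roughly as follows (paths are placeholders; `arcpy` is imported inside the function since it needs an ArcGIS Python environment):

```python
def append_pairs(feature_classes, target_sde):
    """Pair each source feature class with its target path in the EGDB.
    Pure-Python helper, testable without ArcGIS."""
    return [(fc, f"{target_sde}/{fc}") for fc in feature_classes]

def migrate(source_gdb, target_sde, xml_doc):
    import arcpy  # deferred: Esri-licensed, not generally installable
    # 1. Export the schema only, no data.
    arcpy.management.ExportXMLWorkspaceDocument(source_gdb, xml_doc,
                                                "SCHEMA_ONLY")
    # 2. Import the empty schema into the enterprise geodatabase.
    arcpy.management.ImportXMLWorkspaceDocument(target_sde, xml_doc,
                                                "SCHEMA_ONLY")
    # 3. Append the data, feature class by feature class.
    arcpy.env.workspace = source_gdb
    for src, dst in append_pairs(arcpy.ListFeatureClasses() or [],
                                 target_sde):
        # NO_TEST skips field mapping checks; reasonable here because
        # the target schema was just created from the same XML document.
        arcpy.management.Append(src, dst, "NO_TEST")
```

This sketch covers only standalone feature classes; feature classes inside feature datasets would need an extra loop over `arcpy.ListDatasets`, and step 4 (ModelBuilder, Data Interoperability, or FME) is the way to scale this out.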

The ArcGIS copy/paste from FGDB to SDE works well for small datasets. For large datasets the better approach is to create the empty schema first, prepare the RDBMS for a large data load (e.g., disable database archiving, drop the feature class spatial indexes before the load), perform the data load, rebuild the indexes (attribute and spatial), gather fresh statistics, and then return the RDBMS to its normal OLTP editing configuration.

For more best practices on loading large datasets, see the community.esri.com links below.

How Load Large Featureclass SQL Server Geodatabase

How Load Large Featureclass Oracle Geodatabase

Need more tips on how to set up the Enterprise Geodatabase (ArcSDE)?

Mapping and Charting Solutions (MCS) Enterprise Databases Best Practices

Tip: read the Production Mapping Database Guide Books

You can also see the steps to prepare the RDBMS for a large data load in my database template scripts.

Mapping and Charting Solutions (MCS) Enterprise Da... - Esri Community - Database Template Scripts

I hope this helps.

VinceAngelo

If you are appending large numbers of rows to existing feature classes, make sure you drop the spatial and non-critical attribute indexes before starting the Append; this can make an order-of-magnitude difference in load performance. You'll need to rebuild those indexes afterwards. The threshold for plain Append vs. drop indexes/Append/rebuild indexes is only about 10,000 features.
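A hedged sketch of that drop/append/rebuild pattern (the attribute index names are hypothetical, and `arcpy` is imported inside the function since it needs an ArcGIS Python environment):

```python
def should_drop_indexes(row_count, threshold=10_000):
    """Per the advice above: beyond roughly 10,000 features, drop the
    indexes, append, then rebuild, rather than appending into an
    indexed table. Pure-Python helper, testable without ArcGIS."""
    return row_count > threshold

def bulk_append(source_fc, target_fc, row_count, attr_indexes=()):
    import arcpy  # deferred: only available with an ArcGIS install
    drop = should_drop_indexes(row_count)
    if drop:
        arcpy.management.RemoveSpatialIndex(target_fc)
        for idx in attr_indexes:  # non-critical attribute indexes only
            arcpy.management.RemoveIndex(target_fc, idx)
    arcpy.management.Append(source_fc, target_fc, "NO_TEST")
    if drop:
        # Rebuild afterwards; attribute indexes would be re-created
        # with AddIndex using the original field lists.
        arcpy.management.AddSpatialIndex(target_fc)
```

The helper makes the threshold explicit; below it, the index churn costs more than it saves.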

ShannonShields

6 million features is not especially large, but 800 separate feature classes is. Do you have a lot of relationship classes? Domains? More complex datasets like topologies?

Given the 800 separate feature classes I'd be reluctant to recommend importing empty schema and then appending data - that is a lot of separate append operations to coordinate.

If you have a lot of 'behavior' like domains and relationship classes, it will be preserved and imported along with the data if you copy and paste directly from one workspace into the other. A tool like Export will not preserve any of that behavior; it only brings across the data.

Have you tested any of these methods yet to determine if your data is actually too large?

MarGIS
by

@George_Thompson @MarceloMarques @VinceAngelo @ShannonShields Thank you so much for your great input; I appreciate your help. I really love this community that supports users in their work. In my case I went with the copy-and-paste method; it took some time, but it worked fine. I will try the other methods in a staging environment. Thank you all so much for your help.

SSWoodward
Status changed to: Closed