Just wondering what the best method is to copy non-versioned data from one database to another.
I'm using the following code:
import os
import arcpy

for fc in fcList:
    arcpy.CopyFeatures_management(fc, os.path.join(targetGDB, fc))
to iterate through all feature classes in a database and copy them to another database, but I often run into database locking issues. I thought about using the Append tool instead — deleting all features and then loading the new ones in — but that requires the schemas to match. What is the most efficient way to copy data from one database to another?
I'm doing the data update from ArcSDE 9.1 (our main SDE Oracle geodatabase) to 9.3 (our DMZ Oracle geodatabase, used only for web applications). Because I cannot copy/append directly from a low-precision geodatabase to a high-precision geodatabase, I do it in two steps with Python: first copy the 9.1 SDE data to a file geodatabase, then use Append_management to update the 9.3 SDE data from the file geodatabase (after deleting the features in the 9.3 SDE database).
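A minimal sketch of that two-step workflow, assuming a staging file geodatabase already exists, that the feature class names match in both SDE connections, and that the connection/workspace paths and function names below (`src_sde`, `staging_gdb`, `dest_sde`) are placeholders for your own:

```python
import os


def fc_path(workspace, fc_name):
    """Build the full path of a feature class inside a workspace."""
    return os.path.join(workspace, fc_name)


def update_via_staging(src_sde, staging_gdb, dest_sde, fc_names):
    # arcpy is imported here so the pure path helper above works without ArcGIS
    import arcpy

    for name in fc_names:
        staged = fc_path(staging_gdb, name)
        target = fc_path(dest_sde, name)

        # Step 1: copy from the source SDE into the file geodatabase,
        # replacing any stale staged copy first.
        if arcpy.Exists(staged):
            arcpy.Delete_management(staged)
        arcpy.CopyFeatures_management(fc_path(src_sde, name), staged)

        # Step 2: empty the target SDE feature class, then append the
        # staged rows. NO_TEST skips schema matching (see comment below).
        arcpy.DeleteFeatures_management(target)
        arcpy.Append_management(staged, target, "NO_TEST")
```

Deleting features and appending (rather than deleting the feature class itself) keeps the target's registration, privileges, and any dependent services intact.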
I originally didn't want to go with NO_TEST, since there may be schema changes in the data, but I ended up using that option and it seems to work well 🙂