We currently have approximately 1200 file geodatabases, one for each build site, that all have the same schema and an SDE that is a compilation of all of those individual sites. Right now we have to go into the SDE and manually update records according to what we changed in the file geodatabases. Does anyone know of a way that we could have these updates made automatically, preferably with a script that we could run on a nightly basis to keep our data up to date?
This is a fairly broad question to expect a detailed answer to, since the right approach depends on your actual scenario. You don't really explain what you mean by "update records according to what we changed".
If you just want the SDE data to be the "same" as the FGDB FC's, one way is to loop through and truncate the SDE FC's (which removes all data but keeps the schema), then append the FGDB FC's into the SDE datasets.
Then, your SDE data will represent any and all "changes" to the FC's.
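The truncate-and-append loop above can be sketched in ArcPy. This is a minimal sketch, not a finished tool: the workspace paths are placeholders, the name-matching helper assumes SDE names are qualified (e.g. `GIS.DBO.Roads`), and the ArcPy import is deferred so the pure helper can run without ArcGIS installed. Note that Truncate Table will not work on versioned data.

```python
# Nightly truncate-and-append sketch: wipe each SDE feature class,
# then append the matching FGDB feature class (same schema assumed).
import os

def pair_feature_classes(fgdb_fcs, sde_fcs):
    """Match FGDB feature class names to their SDE counterparts.
    SDE names are often owner-qualified (e.g. 'GIS.DBO.Roads'),
    so compare on the unqualified final token, case-insensitively."""
    sde_by_name = {fc.split(".")[-1].upper(): fc for fc in sde_fcs}
    return [(fc, sde_by_name[fc.upper()])
            for fc in fgdb_fcs if fc.upper() in sde_by_name]

def sync_fgdb_to_sde(fgdb_path, sde_conn):
    """Truncate-and-append one FGDB into the SDE (requires ArcGIS)."""
    import arcpy  # deferred so pair_feature_classes is testable without ArcGIS
    arcpy.env.workspace = fgdb_path
    fgdb_fcs = arcpy.ListFeatureClasses()
    arcpy.env.workspace = sde_conn
    sde_fcs = arcpy.ListFeatureClasses()
    for src_name, target in pair_feature_classes(fgdb_fcs, sde_fcs):
        src = os.path.join(fgdb_path, src_name)
        tgt = os.path.join(sde_conn, target)
        arcpy.management.TruncateTable(tgt)           # drops rows, keeps schema
        arcpy.management.Append(src, tgt, "NO_TEST")  # schemas already match
```

For ~1200 site geodatabases you would wrap `sync_fgdb_to_sde` in a loop over the site folder, which is why running it as a scheduled nightly script is practical.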
There are several ways to do this.
Rhett Zufelt is correct in saying a script could be created to truncate the data in the enterprise geodatabase and then append the data from the file geodatabases into it.
Another option that comes to mind (if the data is registered as versioned) is creating Check Out replicas.
Replica file geodatabases can be created off of the enterprise geodatabase. Then edits can occur in those file geodatabases. After the edits are done, the Synchronize Changes tool can be executed to move the edits from the file geodatabases to the enterprise geodatabase.
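The replica workflow above can be sketched with the Synchronize Changes geoprocessing tool. Everything here is an assumption for illustration: the replica naming convention, the site-ID scheme, and the paths are hypothetical, and the ArcPy import is deferred since it only runs where ArcGIS is installed.

```python
# Check-out replica sync sketch (assumed names and paths throughout).
# Each site FGDB is assumed to be a check-out replica of the enterprise
# geodatabase; Synchronize Changes pushes the site edits back to the parent.

def replica_name_for(site_id, owner="DBO"):
    """Assumed naming convention: one replica per build site."""
    return "{0}.Site{1:04d}_Replica".format(owner, site_id)

def checkin_site_edits(sde_conn, fgdb_path, site_id):
    """Push edits from one replica FGDB back to the SDE (requires ArcGIS)."""
    import arcpy  # deferred so the naming helper runs without ArcGIS
    arcpy.management.SynchronizeChanges(
        fgdb_path,                  # geodatabase 1: replica FGDB with the edits
        replica_name_for(site_id),  # replica registered in both geodatabases
        sde_conn,                   # geodatabase 2: parent enterprise GDB
        "FROM_GEODATABASE1_TO_2")   # one-way: site edits into the SDE
```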
If you aren't familiar with replicas and versioning, then doing the truncate and append should probably address your needs.
All of these options can be accomplished by creating a Python script that uses the ArcPy geoprocessing functions, or by using ModelBuilder.
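For the nightly run the original question asks about, the finished script can be scheduled with Windows Task Scheduler. The paths, task name, and start time below are placeholders to adjust for your environment:

```shell
:: Assumed paths and time; point /TR at the Python install that has ArcPy.
schtasks /Create /TN "NightlySDESync" ^
  /TR "C:\Python27\ArcGIS10.2\python.exe C:\scripts\sync_sde.py" ^
  /SC DAILY /ST 01:00
```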