I'm working on a project where we collect data in the field and download it from ArcGIS Online. We have 4 different apps we download data from, and the downloads go into 4 different geodatabases: Elec, Site, Mech, and Stru. I need to do weekly updates, joining the data collected in the field to the tables in our master database. There are 40 different tables that need to be updated. I'm kind of a beginner in Python, but what I need to know is how to batch-append records from multiple tables in multiple databases into a single database. I've tried using Python and ModelBuilder to do this, but the model is massive and has too many parameters, and I also have to run it 4 times, once for each database we download data from.
I've started the script but I'm running into an error where the schemas aren't matching. I think what's happening is that I'm failing to match up the names of the items in two different lists. The lists (inputTablesJoin and outputTables) are identical, and each matching pair of tables has the same schema too. Here's my code:
```python
import arcpy
import os

workspace = r"C:\data\Pump_Station_Inspections"

## Walk through each folder containing the downloaded data
## and find tables (Elec, Site, Mech, and Stru)
inputTables = []
walk = arcpy.da.Walk(workspace, datatype="Table")

## Append tables to list
inputTablesJoin = []
for dirpath, dirnames, filenames in walk:
    for filename in filenames:
        inputTables.append(filename)
inputTables.sort()

## Create a list of tables from the master geodatabase
arcpy.env.workspace = r"C:\data\Pump_Station_Inspections\InspectionsGISData.gdb"
outputTables = arcpy.ListTables()
outputTables.sort()

for inputTable in inputTables:
    if inputTable in outputTables:
        inputTablesJoin.append(inputTable)

for table in inputTablesJoin:
    arcpy.Append_management(table, outputTables)
```
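Update: after staring at this some more, here's a sketch of what I *think* the fix looks like, so treat it as untested. Two guesses at what's wrong with my script: (1) `arcpy.da.Walk` gives back bare filenames, so I probably need to keep the full path with `os.path.join(dirpath, filename)` for Append to find the table, and (2) the last line passes the whole `outputTables` list as the Append target instead of the one matching master table. The `match_tables` helper below is just a name I made up for pairing inputs to targets:

```python
## Sketch of the fix, assuming the table names and schemas really do match.
import os


def match_tables(input_paths, output_names):
    """Pair each downloaded table path with the same-named master table.

    Returns a list of (input_path, output_name) tuples. Comparison is
    case-insensitive, since geodatabase table names can differ in case.
    """
    lookup = {name.lower(): name for name in output_names}
    pairs = []
    for path in input_paths:
        name = os.path.basename(path).lower()
        if name in lookup:
            pairs.append((path, lookup[name]))
    return pairs


if __name__ == "__main__":
    import arcpy

    workspace = r"C:\data\Pump_Station_Inspections"
    master_gdb = r"C:\data\Pump_Station_Inspections\InspectionsGISData.gdb"

    ## Keep the FULL path this time, not just the bare filename
    input_paths = []
    for dirpath, dirnames, filenames in arcpy.da.Walk(workspace, datatype="Table"):
        for filename in filenames:
            input_paths.append(os.path.join(dirpath, filename))

    arcpy.env.workspace = master_gdb
    output_names = arcpy.ListTables()

    ## Append each field table into its single same-named master table.
    ## schema_type="TEST" raises an error if the schemas differ;
    ## switch to "NO_TEST" only if you're sure the fields line up.
    for in_path, out_name in match_tables(input_paths, output_names):
        target = os.path.join(master_gdb, out_name)
        arcpy.Append_management(in_path, target, schema_type="TEST")
```

The idea is that one Append call per matched pair replaces the 4 ModelBuilder runs, since the Walk picks up tables from all four downloaded geodatabases in one pass.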
Any help will be much appreciated. Thanks!!