I created a model to update a feature class (AGS 10.2.2) that processes 38,000 records in about 3 minutes.
When I call the same model from Python (ArcGISx6410.2\python.exe), the job takes nearly 3 hours.
Any suggestions as to what I'm possibly missing will be greatly appreciated!
Where is the data stored (Shapefile, personal gdb, file gdb, enterprise geodatabase)?
Where is this data stored relative to your machine (local to the machine, on a remote server, etc.)?
What are you updating within the data?
How are you updating the data in ModelBuilder vs Python?
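To narrow down which of these factors matters, it can help to time each geoprocessing step individually when the script runs from Python. A minimal sketch (the commented-out tool call is a placeholder for whatever steps the model performs):

```python
import time
from contextlib import contextmanager

@contextmanager
def timed(step):
    """Print elapsed wall-clock time for one geoprocessing step."""
    start = time.perf_counter()
    yield
    print(f"{step}: {time.perf_counter() - start:.1f} s")

# Usage inside the script, one block per tool call:
# with timed("MakeXYEventLayer"):
#     arcpy.MakeXYEventLayer_management(...)
```

Usually one step dominates the runtime, which makes it much easier to tell whether the problem is the data location, the update method, or something else.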
Thanks for the reply!
I have a large collection of scripts that refresh publishing databases for our web mapping from the various user databases where edits occur. These scripts were based (mainly) on SDE command line tools.
Over the past month, I’ve been recreating them in ModelBuilder and either exporting them as Python scripts or calling the models from Python (in cases where I have “In Memory” features such as route events and do not want to create yet another temp feature class in my SQL Server database).
Most of them have worked really well, but I have one where I pull X/Y data from an entirely separate SQL Server system and create an X/Y feature class. Then I delete the rows from my target feature class and append the result of the “Make X/Y” operation.
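For reference, the workflow described above can be sketched in arcpy roughly as follows. The table, field, and layer names are placeholders, not from the original post, and the function takes the arcpy module as a parameter only so it can be exercised without an ArcGIS install. One assumption worth testing here: Truncate Table (which requires unversioned data) wipes all records far faster than a row-by-row Delete Rows, and that difference can be large on 38,000 records:

```python
def refresh_target(arcpy, source_table, target_fc,
                   x_field="X", y_field="Y", spatial_ref=None):
    """Rebuild target_fc from XY coordinates pulled from source_table.

    Placeholder names throughout; arcpy is passed in rather than imported
    so the sketch can be tested without ArcGIS.
    """
    # Build an in-memory event layer from the XY columns, so no temp
    # feature class is written to SQL Server.
    layer = "xy_events"
    arcpy.MakeXYEventLayer_management(source_table, x_field, y_field,
                                      layer, spatial_ref)
    # Truncate is much faster than deleting rows one at a time, but only
    # works on unversioned data.
    arcpy.TruncateTable_management(target_fc)
    # Append the fresh rows; NO_TEST skips schema checking for speed,
    # assuming the schemas already match.
    arcpy.Append_management(layer, target_fc, "NO_TEST")
```

If the Python run is spending its time in the delete step, swapping Delete Rows for Truncate Table is one of the first things worth trying.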
SQL Server 2012 / ArcGIS 10.2.2