Hello Pro users!
This survey is designed to gather feedback for "Apply Symbology From Layer" GP tool. Please share your ideas and opinions, as they are invaluable to us. Thanks!
Is there any news to share about development around this tool?
This tool is absolutely vital for a custom-written geoprocessing toolbox of mine that makes extensive use of Python scripting.
Unfortunately, I have seen deplorable performance from this tool when used with ArcGIS Query Layers on very large datasets (>100M records!) stored in an ordinary PostgreSQL / PostGIS database (not enterprise-geodatabase enabled!).
E.g. I have seen it take >48 hours(!) to update the symbology of a single Query Layer using this tool. This totally wrecks the performance of my tools.
I have absolutely no idea why this tool is so slow in these situations (note that the underlying database table is fully indexed), other than a hunch that it is doing a very inefficient full table scan, with slow cursors and cursor handling that is inappropriate for a dataset of this size and type.
Unfortunately, despite extensive review of the arcpy options, I haven't found any realistic alternative for my custom tool and workflow. Because of the specific, highly dynamic processing flow, I absolutely cannot use something like a script tool output parameter's "Symbology" option: my tool generates a large number of dynamically created layers, not a fixed set of output feature layers whose symbology could be set via that parameter option.
There is also a large difference in performance between different types of input layer for the tool.
E.g., using a 390M record dataset:
- It takes 2 minutes to run the "Apply Symbology From Layer" tool when the feature layer's data source in Pro is a SQLite-based ESRI "Mobile Geodatabase" feature class.
- It takes about 3 hours to run the "Apply Symbology From Layer" tool when the feature layer in Pro is a Query Layer whose SQL statement references a dedicated table stored in PostgreSQL / PostGIS containing only the records used in the Query Layer, so the SQL statement is therefore of the form "SELECT * FROM <TABLE_NAME>" without a WHERE clause.
- It takes >48 hours to run the "Apply Symbology From Layer" tool when the feature layer in Pro is a Query Layer whose SQL statement references a generic table stored in PostgreSQL / PostGIS containing the records used in the Query Layer plus others, so the SQL statement is therefore of the form "SELECT * FROM <TABLE_NAME> WHERE x=y", i.e. including a WHERE clause.
In all of the above cases, the data is actually the same (buildings from OpenStreetMap), just stored differently.
As noted, I don't understand why this tool needs to take so much time. It just needs to swap some symbols in the layer's symbology (note that I explicitly use the 'update_symbology="MAINTAIN"' option in my arcpy code), and I would expect this to be near-instantaneous.
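For context, the call pattern I use looks roughly like the sketch below (layer and .lyrx names are hypothetical). Since arcpy is only available inside the ArcGIS Pro Python environment, the sketch guards the import:

```python
# Hedged sketch of the arcpy call pattern; "osm_buildings_qry" and
# "buildings_template.lyrx" are hypothetical names, not from a real project.
try:
    import arcpy  # only available in the ArcGIS Pro Python environment
except ImportError:
    arcpy = None


def apply_saved_symbology(target_layer, template_lyrx):
    """Apply a saved layer file's symbology to a layer, keeping existing
    ranges/class breaks (update_symbology="MAINTAIN")."""
    if arcpy is None:
        raise RuntimeError("arcpy requires the ArcGIS Pro Python environment")
    arcpy.management.ApplySymbologyFromLayer(
        in_layer=target_layer,
        in_symbology_layer=template_lyrx,
        update_symbology="MAINTAIN",  # just swap symbols, don't recompute ranges
    )


# Usage (inside Pro):
# apply_saved_symbology("osm_buildings_qry", r"C:\templates\buildings_template.lyrx")
```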
I have now actually managed to work around this performance issue of "Apply Symbology From Layer" with Query Layers.
I had been contemplating options for quite a while, but overlooked a relatively straightforward solution that works for my specific case:
- Just before running the "Apply Symbology From Layer" tool, temporarily rename the original table in the database.
- Create a database view under the original table's name that exposes only a single record of the renamed table (I select it with MAX on the objectid column in the SQL).
- Run the "Apply Symbology From Layer" tool against the layer, which now references the single-record view. This is very fast and makes the tool's performance independent of the size of the original table.
- Afterwards, drop the view and rename the table back to its original name.
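The database side of this swap can be sketched as below. It is shown against SQLite so it runs anywhere, but the same statements (ALTER TABLE ... RENAME TO, CREATE VIEW, DROP VIEW) map directly to PostgreSQL; the table name "buildings" and column "objectid" are hypothetical stand-ins:

```python
import sqlite3

# Demo of the table-swap trick on an in-memory SQLite database.
# "buildings" / "objectid" are hypothetical names for illustration.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE buildings (objectid INTEGER PRIMARY KEY, name TEXT);
    INSERT INTO buildings (name) VALUES ('a'), ('b'), ('c');
""")

# 1. Move the real table out of the way.
con.execute("ALTER TABLE buildings RENAME TO buildings_swap")

# 2. Stand in a one-row view under the original name (MAX on objectid).
con.execute("""
    CREATE VIEW buildings AS
    SELECT * FROM buildings_swap
    WHERE objectid = (SELECT MAX(objectid) FROM buildings_swap)
""")

# ... run "Apply Symbology From Layer" here; the Query Layer now sees one row ...
rows_seen = con.execute("SELECT COUNT(*) FROM buildings").fetchone()[0]

# 3. Tear down the view and restore the original table.
con.execute("DROP VIEW buildings")
con.execute("ALTER TABLE buildings_swap RENAME TO buildings")
```

While the view is in place, the Query Layer's SQL resolves against a single record, so whatever scanning the tool does finishes immediately.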
Fortunately, a Query Layer already in the TOC doesn't seem to be affected by temporarily knocking out its data source, so this rather drastic measure doesn't appear to have negative side effects.
It would still be nice, though, if the original issue were fixed.