Dynamically Rendering PostgreSQL Query Layer

Discussion created by jeadly on Mar 10, 2014
Latest reply on Mar 11, 2014 by vangelo-esristaff
I have an ArcGIS Server service driven by a Query Layer that joins geometries to PostgreSQL table records.  The table contains one record per sex/race/age/site combination per US county.  The result is essentially 3,000 overlapping polygons per US county, which takes a long time (and rightfully so) to draw in ArcMap.  Here is an example of the Query Layer SQL statement:

[INDENT]SELECT round(dat.rate,1) as rate_rnd, shp.shape, dat.* FROM tblmortratescounty dat JOIN us_counties_mod shp on (shapefile_id = shapefileid) where icount>15[/INDENT]

The web application that leverages this service uses a Definition Query and Dynamic Layers to request a subset of (non-overlapping) records and classifies them server-side with a Class Breaks Renderer.  The result is thousands of dynamically generated maps, which perform reasonably at the state level (one state at a time).  However, at the US-by-county level the map takes a long time (20-30 seconds) to draw.

I'm looking to decrease the draw time for the US-by-county map.  Does anyone have advice on indexing or SQL statement changes that might help?
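In case it helps the discussion, here is the kind of indexing I could try against the tables in the query above. The index names are made up, and this assumes the shape column is a PostGIS geometry type (an ArcSDE ST_Geometry column would need an ST_Geometry spatial index instead):

```sql
-- Hypothetical index names; table/column names come from the Query Layer SQL above.

-- Partial B-tree index on the attribute table's join key, limited to the
-- rows the Query Layer actually returns (keeps the index small):
CREATE INDEX tblmortratescounty_shapefileid_idx
    ON tblmortratescounty (shapefileid)
    WHERE icount > 15;

-- Spatial (GiST) index on the county geometry, assuming a PostGIS column:
CREATE INDEX us_counties_mod_shape_idx
    ON us_counties_mod USING GIST (shape);

-- Refresh planner statistics so the new indexes get considered:
ANALYZE tblmortratescounty;
ANALYZE us_counties_mod;
```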

Other observations:
When I perform a REST query on the layer for 3,000 records with all fields, it returns quickly.  Likewise for the dynamic legend object.  So the service has no trouble accessing 3,000 records of data when it isn't drawing a map.  I have also tried similar services on a subset (30,000 records) that dynamically map the US by county from a Feature Class instead of a Query Layer, and the Feature Class version is much faster.  So the Class Breaks Renderer on 3,000 polygons shouldn't be the problem.
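For reference, the attribute-only REST query I mean is along these lines (server, service name, and the where clause are placeholders, not my actual values):

```
https://<server>/arcgis/rest/services/<service>/MapServer/0/query
    ?where=icount>15
    &outFields=*
    &f=json
```

This round-trip on 3,000 records comes back quickly, which is what makes me think the data access itself isn't the bottleneck.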

The server performs the spatial query first, before applying the Definition Query.  At the full extent this doesn't reduce the number of records evaluated, so it just wastes time.  Is there a way I can prioritize the attribute search over the spatial search?
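One thing I can do on the database side is run the Query Layer SQL (with a representative Definition Query condition appended) through EXPLAIN ANALYZE in psql, to see which indexes PostgreSQL actually uses and in what order it evaluates the filters. The table qualifiers below are my assumption about which table owns each join column:

```sql
EXPLAIN ANALYZE
SELECT round(dat.rate, 1) AS rate_rnd, shp.shape, dat.*
FROM tblmortratescounty dat
JOIN us_counties_mod shp
  ON shp.shapefile_id = dat.shapefileid
WHERE dat.icount > 15;
```

That only shows the database-side plan, though; it wouldn't change the order in which ArcGIS Server itself issues the spatial versus attribute requests.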

Thanks in advance!