Complex query in a query layer

04-08-2016 09:25 AM
Chang-HengYang
New Contributor III

Hi all,

Thank you for your time in advance. I am wondering whether any tool or layer in ArcGIS can work with a SQL query directly. In my previous post, Mody Buchbinder mentioned that MakeQueryTable might work with the complex query listed below; however, it did not. Now I have been told that a query layer might handle it. Or would you suggest running the complex query in the MSSQL server and exporting the results to ArcMap? (A view-based sketch of that option follows the query below.)

Let me know your valuable opinions. 

Thanks,

Hank

SELECT t.POINT_ID,
       t.CollectionDate,
       t.HistoricDate,
       t.U,
       t.LAT,
       t.LONG,
       t.WellDepth,
       t.DataSource
FROM (SELECT POINT_ID, MAX(CollectionDate) AS MaxValue
      FROM NM_Water_Quality.dbo.MASTER_UNION_Water_Quality
      WHERE (U IS NOT NULL) AND (U > 0)
      GROUP BY POINT_ID) AS x
     INNER JOIN NM_Water_Quality.dbo.MASTER_UNION_Water_Quality AS t
         ON x.POINT_ID = t.POINT_ID AND x.MaxValue = t.CollectionDate
WHERE (t.U IS NOT NULL) AND (t.U > 0)
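
For reference, one way to pursue the SQL Server option would be to wrap the same logic in a view on the database side (a sketch only; the view name vw_Latest_U_Per_Point is made up) and then point a query layer, MakeQueryTable, or an export at that view:

CREATE VIEW dbo.vw_Latest_U_Per_Point
AS
-- Latest CollectionDate per POINT_ID where a positive U value exists.
SELECT t.POINT_ID, t.CollectionDate, t.HistoricDate, t.U,
       t.LAT, t.LONG, t.WellDepth, t.DataSource
FROM (SELECT POINT_ID, MAX(CollectionDate) AS MaxValue
      FROM NM_Water_Quality.dbo.MASTER_UNION_Water_Quality
      WHERE U IS NOT NULL AND U > 0
      GROUP BY POINT_ID) AS x
     INNER JOIN NM_Water_Quality.dbo.MASTER_UNION_Water_Quality AS t
         ON x.POINT_ID = t.POINT_ID AND x.MaxValue = t.CollectionDate
WHERE t.U IS NOT NULL AND t.U > 0;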

11 Replies
VinceAngelo
Esri Esteemed Contributor

Cross-database queries usually perform poorly. If you have spatial data somewhere, you'd be much better off accessing that data locally, or establishing a process to replicate it (little 'r' "replicate", vice big 'R' "ArcGIS Geodatabase Replication") to the database to which you do have access.

- V

DougGreen
Occasional Contributor II

@Chang-HengYang this has been a while, but this use case is becoming more common, so I figured I'd share something we've learned. We do this sort of thing quite often and can validate what @VinceAngelo says about performance, replication, etc. We have found a pattern that lets us pull information from various instances and deliver a feature class with great performance. The one drawback is that it's not live (no real-time view of the data in their tables). However, if your needs allow a regular update interval (daily or weekly, etc.), give this a try.

 

  1. Create a feature class in your Enterprise GDB with the same schema as the query layer you're trying to make.
  2. Register it as versioned but indicate in metadata or other documentation that it is not to be edited by end users.
  3. Create a stored procedure that can be run by a SQL Agent Job on a schedule and does the following (a rough sketch follows this list):
    1. Based on https://desktop.arcgis.com/en/arcmap/10.7/manage-data/using-sql-with-gdbs/edit-versioned-data-using-...
    2. Deletes all rows from the multi-version view (same name as the feature class with '_evw' appended)
    3. Inserts the results of your query into the multi-version view
  4. Running a compress on the geodatabase after these operations will help performance, since all of these new edits can then be compressed into the base table.
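
A rough sketch of what the stored procedure in step 3 could look like is below (the feature class WATER_QUALITY_LATEST, its column list, and the use of SDE.DEFAULT are placeholders for your own setup; the sde procedures are the ones described in the help topic linked in step 3.1):

CREATE PROCEDURE dbo.Refresh_WaterQualityLatest
AS
BEGIN
    SET NOCOUNT ON;

    -- Target a version and start an edit session
    -- (per the linked "edit versioned data using SQL" topic).
    EXEC sde.set_current_version 'SDE.DEFAULT';
    EXEC sde.edit_version 'SDE.DEFAULT', 1;   -- 1 = start edit

    -- Step 3.2: delete all rows from the multi-version (_evw) view.
    DELETE FROM dbo.WATER_QUALITY_LATEST_evw;

    -- Step 3.3: insert the results of the complex query.
    -- NOTE: the SHAPE column would also need to be populated here,
    -- e.g. built from LAT/LONG with your spatial type of choice.
    INSERT INTO dbo.WATER_QUALITY_LATEST_evw
        (POINT_ID, CollectionDate, HistoricDate, U, LAT, LONG, WellDepth, DataSource)
    SELECT t.POINT_ID, t.CollectionDate, t.HistoricDate, t.U,
           t.LAT, t.LONG, t.WellDepth, t.DataSource
    FROM (SELECT POINT_ID, MAX(CollectionDate) AS MaxValue
          FROM NM_Water_Quality.dbo.MASTER_UNION_Water_Quality
          WHERE U IS NOT NULL AND U > 0
          GROUP BY POINT_ID) AS x
         INNER JOIN NM_Water_Quality.dbo.MASTER_UNION_Water_Quality AS t
             ON x.POINT_ID = t.POINT_ID AND x.MaxValue = t.CollectionDate
    WHERE t.U IS NOT NULL AND t.U > 0;

    EXEC sde.edit_version 'SDE.DEFAULT', 2;   -- 2 = stop edit and save
END;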

NOTE: There is a way to do this without registering the feature class as versioned, but you have to take some extra steps to handle the geodatabase-managed OBJECTID field in a way that keeps it consistent with the geodatabase (https://desktop.arcgis.com/en/arcmap/10.7/manage-data/using-sql-with-gdbs/inserting-a-value-into-a-g...).
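
For that non-versioned route, the gist (a sketch, assuming a SQL Server geodatabase and a placeholder table dbo.MY_FEATURECLASS; check the linked topic for the exact call for your release) is to ask the geodatabase for the next ObjectID before each insert:

DECLARE @oid int;

-- Request the next geodatabase-managed ObjectID for this table.
EXEC sde.next_rowid 'dbo', 'MY_FEATURECLASS', @oid OUTPUT;

INSERT INTO dbo.MY_FEATURECLASS (OBJECTID, POINT_ID, CollectionDate, U)
VALUES (@oid, 'EXAMPLE_POINT', '2016-04-08', 1.5);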

This lets the database chug away at a hefty process when we don't have any users in the system, and it provides standard feature classes with spatial indexes to our users.