
Performance enhancement options for querying a file geodatabase

Question asked by peterlen on Sep 25, 2014
Latest reply on Sep 29, 2014 by jamesfreddyc

Hello - We are using ArcGIS 10.0 (Python 2.6). We have a file geodatabase with 400+ feature classes (polygon features) containing a total of about 400,000 features. Some feature classes have only a few features while others have tens of thousands.

The UI allows the user to select an AOI extent. On the server, we use arcpy to perform an INTERSECT search using that AOI extent. It works by looping through each of the 400+ feature classes and performing the INTERSECT search on each one, which can take a couple of minutes to run.

I am wondering if there is a more efficient way to perform the search, or if it is just the case that you have to run the search against each feature class, one at a time. Initially I thought there might be a way to perform a single search across all of the feature classes, but was told that option was not really feasible. So the first question is whether or not I simply have to stick with doing 400+ individual searches. If the answer is yes, then I am wondering whether the way it is being done is at least efficient. The process for EACH feature class is:

 

# Make an in-memory feature layer for this feature class
feature_layer = "in_memory\\" + layer_name
fc_path = main_gdb + "/" + layer_name
arcpy.MakeFeatureLayer_management(fc_path, feature_layer)
# Select the features that intersect the AOI extent
arcpy.SelectLayerByLocation_management(feature_layer, "INTERSECT", aoi, "", "NEW_SELECTION")
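For context, the surrounding loop looks roughly like the sketch below. This is a simplified reconstruction of the process described above, not the exact script; how the feature class names are listed and what is done with each selection are assumptions.

import arcpy

arcpy.env.workspace = main_gdb  # main_gdb and aoi are set earlier in the script

# Loop over every feature class in the file geodatabase and run the
# AOI intersection against each one in turn.
for layer_name in arcpy.ListFeatureClasses():
    feature_layer = "in_memory\\" + layer_name
    fc_path = main_gdb + "/" + layer_name
    arcpy.MakeFeatureLayer_management(fc_path, feature_layer)
    arcpy.SelectLayerByLocation_management(feature_layer, "INTERSECT", aoi, "", "NEW_SELECTION")
    # ... work with the selected features here ...
    arcpy.Delete_management(feature_layer)  # clean up the in-memory layer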

 

It may be that this is already as good as it can get. I am just hoping that there is a way to speed up the searches in some manner.

 

Any thoughts would be greatly appreciated - Peter
