Performance enhancement options for querying file geo-database

09-25-2014 12:49 PM
PeterLen
Occasional Contributor

Hello - We are using ArcGIS 10.0 (Python 2.6) with a file geo-database that has 400+ feature classes (polygon features) containing a total of about 400,000 features.  Some feature classes have only a few features while others have tens of thousands.  The UI allows the user to select an AOI extent.  On the server, we use arcpy to perform an INTERSECTS search using the AOI extent: the code loops through each of the 400+ feature classes and performs the INTERSECTS search against it.  This can take a couple of minutes to run.

I am wondering if there is a more efficient way to perform the search, or if it is simply a case of having to run the search against each feature class, one at a time.  Initially I thought there might be a way to perform a single search over all of the feature classes, but was told that option was not really feasible.  So the first question is whether or not I simply have to stick with doing 400+ individual searches.  If the answer is yes, then I am wondering whether the way it is being done is at least efficient.  The process for EACH feature class is:

```python
feature_layer = "in_memory\\" + layer_name
fc_path = main_gdb + "/" + layer_name

arcpy.MakeFeatureLayer_management(fc_path, feature_layer)
arcpy.SelectLayerByLocation_management(feature_layer, "INTERSECT", aoi, "", "NEW_SELECTION")
```
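For intuition, the loop above amounts to 400+ overlap tests against the AOI.  A minimal, arcpy-free sketch of that shape (all names and extents here are hypothetical, and boxes stand in for the real geometry test that SelectLayerByLocation performs):

```python
# Sketch of the per-feature-class AOI search. Extents are hypothetical
# (xmin, ymin, xmax, ymax) tuples; the real tool tests full geometries.

def extents_intersect(a, b):
    """True if two bounding boxes (xmin, ymin, xmax, ymax) overlap."""
    return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

def feature_classes_hit_by(aoi, fc_extents):
    """Names of feature classes whose extent overlaps the AOI."""
    return [name for name, ext in fc_extents.items()
            if extents_intersect(aoi, ext)]

# Hypothetical data standing in for the 400+ feature classes:
fc_extents = {
    "parcels":  (0, 0, 10, 10),
    "wetlands": (20, 20, 30, 30),
    "roads":    (5, 5, 25, 25),
}
aoi = (8, 8, 12, 12)
print(feature_classes_hit_by(aoi, fc_extents))  # → ['parcels', 'roads']
```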

It may be that this is already as good as it gets.  I am just hoping that there is a way to speed up the searches in some manner.

Any thoughts would be greatly appreciated - Peter

1 Reply
JamesCrandall
MVP Frequent Contributor

Peter,

One suggestion is to be sure to clean up the in_memory workspace before and after each iteration; otherwise you may end up consuming all available RAM (I believe that is where in_memory data is written).  I include this def() in most of my tools and can call it anytime/anywhere needed.  It is especially useful when all of your processing is intermediate data -- just remember that you would then need to write the "final" output to disk before clearing.


```python
def clearINMEM():
    # The List* functions operate on the current workspace,
    # so point it at the in_memory workspace first.
    arcpy.env.workspace = "in_memory"

    fcs = arcpy.ListFeatureClasses()
    tabs = arcpy.ListTables()

    # Delete each in-memory feature class.
    for f in fcs:
        arcpy.Delete_management(f)
        arcpy.AddMessage("deleted: " + f)

    # Delete each in-memory table.
    for t in tabs:
        arcpy.Delete_management(t)
        arcpy.AddMessage("deleted: " + t)
```
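The discipline above -- keep intermediates in a scratch space, persist the final output, then wipe the scratch before the next iteration -- can be sketched without arcpy.  Everything here is a hypothetical stand-in (a plain dict in place of the in_memory workspace), not arcpy API:

```python
# Hypothetical sketch: a dict stands in for the in_memory workspace,
# and clear_scratch() plays the role of clearINMEM().
scratch = {}

def make_selection(layer_name):
    # Stand-in for MakeFeatureLayer + SelectLayerByLocation writing
    # intermediate data into the scratch space.
    scratch[layer_name] = "selection for " + layer_name
    return scratch[layer_name]

def clear_scratch():
    # Wipe everything in the scratch space before the next iteration.
    for name in list(scratch):
        del scratch[name]

results = []
for layer_name in ["fc_one", "fc_two", "fc_three"]:  # hypothetical layer names
    results.append(make_selection(layer_name))       # "final" output kept elsewhere
    clear_scratch()                                  # memory stays bounded per pass

print(len(results), len(scratch))  # → 3 0
```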
