Do I need to rebuild attribute indices when using InsertCursor?

09-06-2019 12:00 PM
williamwinner
New Contributor III

Hello!  I have a series of tools that I have written for a Python toolbox.  The idea is that we can take point data from a variety of source types and load it into a single geodatabase.  I had been using this pretty effectively up until recently, and I'm not sure what is happening.  The only change I have made recently is adding an attribute index to the geodatabase for quicker searching.

The basic flow of the script is that it takes the most recent data and gets its source date and its extents.  The extents are either provided or calculated using Bounding Geometry.  Once I have those two layers, it queries the database: first to find all data within the bounding polygon, and then to find any of that selected data that is older.  It then deletes the older data within the bounding polygon.  It is these two steps that are not working as expected.  Even when I know there is data within the bounding polygon, the Select By Location reports that it didn't find any.

The script is written so that it can take a number of input files and iterate through them.  It seems to work for the first one but then returns no selection on subsequent iterations.  The only thing I can think of is that loading the previous data has somehow corrupted the index and broken future searches.

Any ideas on what to try would be helpful!

Here's some of the code:

In the execute portion:

# Get the SurveyJob file
fcSurveyJob = gdb + "\\SurveyJob" #this has the extents of the point data

if not arcpy.Exists(fcSurveyJob):
    arcpy.AddWarning("No SurveyJob feature class was found...skipping file...")
    intCurJob += 1
    continue # go to the next gdb

# Get the SurveyPoints file
fcSurveyPoint = gdb + "\\SurveyPoint"

if not arcpy.Exists(fcSurveyPoint):
     arcpy.AddWarning("No SurveyPoint feature class was found...skipping file...")
     intCurJob += 1
     continue # go to the next gdb

# Delete Points within SurveyJob that are older than current data 
sordat = helper.GetSORDATfromEhydro(fcSurveyJob) #SORDAT = Source Date as a Date type
if overallSORDAT < sordat: overallSORDAT = sordat

helper.DeletePointsWithinJob(fcSurveyJob, lyrUSACE, long(sordat.strftime("%Y%m%d")))  # data is stored in the geodatabase as a long

# Import points
helper.ImportPoints(fileTemp, lyrUSACE, sorind)
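As a sanity check, the SORDAT conversion used above can be reproduced in plain Python (this is just a sketch, not part of the toolbox; `long()` is Python 2 / ArcGIS Desktop, so `int()` stands in for it here):

```python
from datetime import date

def date_to_sordat(d):
    """Encode a date as the YYYYMMDD number stored in the geodatabase.
    (int() here; the Python 2 toolbox uses long(), which behaves the same.)"""
    return int(d.strftime("%Y%m%d"))

sordat = date_to_sordat(date(2019, 9, 6))
print(sordat)  # 20190906

# YYYYMMDD integers compare in the same order as the dates themselves,
# which is what makes a "SORDAT < ..." attribute selection valid.
assert date_to_sordat(date(2018, 12, 31)) < sordat
```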

and my helper class:

def DeletePointsWithinJob(self, TempBoundingPolygon, USACELyr, sordat):
    # Select OutputFC within current area
    arcpy.AddMessage("\n  -Deleting older points within area of new data...")
    arcpy.AddMessage("    -Selecting Features within bounding polygon " + TempBoundingPolygon + "...")
    arcpy.AddMessage("      -SORDAT=" + str(sordat))

    originalLyr = USACELyr
    try:
        USACELyr = arcpy.MakeFeatureLayer_management(USACELyr, "OutputLayer_lyr")
    #    arcpy.SelectLayerByLocation_management("OutputLayer_lyr", "WITHIN", eHydroSurveyJob)
    except:
        arcpy.AddMessage("Error")
        USACELyr = originalLyr

    arcpy.SelectLayerByLocation_management(USACELyr, "WITHIN", TempBoundingPolygon)
    arcpy.AddMessage("      -Found " + str(arcpy.GetCount_management(USACELyr).getOutput(0)) + " points within Bounding Polygon")

    # Select from that selection
    strSelection = "SORDAT < " + str(sordat)
    arcpy.SelectLayerByAttribute_management(USACELyr, "SUBSET_SELECTION", strSelection)
    arcpy.AddMessage("        -Selection string= " + strSelection)
    arcpy.AddMessage("      -Found " + str(arcpy.GetCount_management(USACELyr).getOutput(0)) + " older points")

    #... more code

def ImportPoints(self, eHydroFC, OutputFC, SORIND):
    arcpy.AddMessage("\n  -Copying data to " + OutputFC)
    arcpy.AddMessage("    -Setting SORIND to: " + SORIND)

    # open a search cursor on the file
    inputFields = ["SHAPE@XY","Z_depth", "SurveyDateStamp"]
    outputSR = arcpy.Describe(OutputFC).spatialReference
    inputRows = arcpy.da.SearchCursor(eHydroFC, inputFields, spatial_reference=outputSR)

    # get input cursor
    outputFields = ["SHAPE@XY", "DEPTH_M", "SORDAT", "SORIND"]
    outputRows = arcpy.da.InsertCursor(OutputFC, outputFields)

    # Set Progressor
    intCount = int(arcpy.GetCount_management(eHydroFC)[0])
    arcpy.AddMessage("    -Copying " + str(intCount) + " features.")
    newProgressor = MyProgressor("step", "Copying input features...",0, intCount)
    curCount = 0

    # Start copying
    for inputRow in inputRows:
        inputDate = inputRow[2].date()
        SORDAT = long(inputDate.strftime("%Y%m%d"))
        outputRows.insertRow((inputRow[0], (inputRow[1]/3.28084), SORDAT, SORIND))
        newProgressor.StepProgressor()
            
        curCount += 1
        if curCount % 1000 == 0:
            newProgressor.SetProgressorLabel("Copying input features...est. time remaining: " + newProgressor.EstimatedTimeRemaining)

    arcpy.AddMessage("\n  -Importing complete.")
    del inputRows
    del outputRows

    return
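(In case it helps anyone reading along: the `inputRow[1]/3.28084` divide in ImportPoints is the usual feet-to-meters conversion, taking Z_depth in feet to DEPTH_M in meters.  A standalone sketch of just that step, with a hypothetical helper name:)

```python
FEET_PER_METER = 3.28084  # same constant as in ImportPoints

def feet_to_meters(depth_ft):
    """Convert a depth in feet to meters (hypothetical helper, not in the toolbox)."""
    return depth_ft / FEET_PER_METER

# 3.28084 ft is exactly one meter by this constant
print(feet_to_meters(3.28084))  # 1.0
```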

I'm deleting the InsertCursors properly, and this was working perfectly right up until I added the attribute index on SORDAT.  Do I need to rebuild the indices?  If so, there wouldn't be much point in having them, as rebuilding each time would take longer than the search would without an index.

Thanks!

1 Reply
williamwinner
New Contributor III

Well, in my helper class, I commented out the section within the try that makes a new feature layer.  That was a bit of legacy code from when everything was in one file; when I separated it into multiple files and added the helper class, I handled the layer creation at the beginning, so that call was really just adding extra time.

And removing it seems to have fixed the issue, although I don't understand why.  That try wasn't actually creating a new feature layer, because the input was already a feature layer.  I had an output at the bottom of that function that showed me the name of USACELyr, and it was never my temporary layer.  We'll see if it's still working properly next week, I guess.
