
Mystery locks on feature classes

06-11-2013 12:49 PM
KerryAlley
Frequent Contributor
I have a short geoprocessing script with an arcpy.Delete_management() line that keeps throwing a lock error: "ERROR 000464: Cannot get exclusive schema lock. Either being edited or in use by another application."

Details: (code pasted below too)
I can successfully delete "lrs_buffer" if I skip the lines of code that define the mxd and the layers. However, I can't think of why creating the map document object would put a lock on "lrs_buffer", since "lrs_buffer" was never added to the mxd. Also, when I use a file geodatabase as the workspace and watch it through Windows Explorer, no lock files for "lrs_buffer" are created during this workflow. I have the same problem when I use a personal geodatabase as my workspace.

import arcpy
arcpy.env.overwriteOutput = True  # note: the setting is overwriteOutput, not overwrite
arcpy.env.workspace = r"V:\Projects\Shared\RouteLogSystem\ArcGIS_10_Prototype\Prototype_V10\rtlogptsQAQC\RouteLogPointsQAQC.gdb"
mxd = arcpy.mapping.MapDocument(r"V:\Projects\Shared\RouteLogSystem\ArcGIS_10_Prototype\Prototype_V10\rtlogptsQAQC\RouteLogPoints.mxd")

rtlogpts = arcpy.mapping.ListLayers(mxd, "*rtlogpts")[0] #feature class on SDE
lrs = arcpy.mapping.ListLayers(mxd, "*lrs_route_twn")[0] #feature class on SDE
rdsmall_copy = arcpy.mapping.ListLayers(mxd, "rdsmall_arc")[0] #feature class in workspace

bufferFC = "lrs_buffer"
if arcpy.Exists(bufferFC):
    arcpy.Delete_management(bufferFC)
arcpy.Buffer_analysis(lrs.dataSource, bufferFC, "2 Meters", dissolve_option = "ALL")
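As a side note, the lock files mentioned above can be checked from Python instead of Windows Explorer: a file geodatabase is just a folder on disk, so its locks show up as *.lock files that glob can list. A minimal sketch (the helper name and the throwaway demo folder are my own illustration, not part of the script above):

```python
import glob
import os
import tempfile

def list_gdb_locks(gdb_path):
    """Return the names of any .lock files inside a file geodatabase folder."""
    return [os.path.basename(p)
            for p in glob.glob(os.path.join(gdb_path, "*.lock"))]

# Demo against a throwaway folder standing in for a .gdb:
fake_gdb = tempfile.mkdtemp(suffix=".gdb")
print(list_gdb_locks(fake_gdb))             # no locks yet: []
open(os.path.join(fake_gdb, "demo.sr.lock"), "w").close()
print(list_gdb_locks(fake_gdb))             # ['demo.sr.lock']
```

Calling this on the real .gdb right before the Delete_management line would show whether ArcGIS is holding a lock at that moment.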


Any ideas about what is happening?

Thanks!
Kerry
10 Replies
ChrisPedrezuela
Frequent Contributor
Hi guys,

I ran a test: a simple script that creates three blank FCs in a FGDB and then deletes them right after. I looped it many times against the same FGDB, and at some point the schema-lock (or "workspace is read only") error appeared.

Here is the simple script I was running,

import arcpy

filenames = ['data1', 'data2', 'data3']

def createFiles():
    for filename in filenames:
        print 'Creating '+filename
        arcpy.CreateFeatureclass_management(r'C:\TEMP\Dump\TEST.gdb', filename, 'POINT', '', '', '', '')
    del filename
    
def deleteFiles():
    arcpy.env.workspace = r'C:\TEMP\Dump\TEST.gdb'
    lyrs = arcpy.ListFeatureClasses()
    for lyr in lyrs:
        print 'Deleting '+lyr
        arcpy.Delete_management(lyr, '')
    del lyr, lyrs

def runTimes(runNum):
    print 'Run' + str(runNum)
    createFiles()
    deleteFiles()

for x in range(100):
    runTimes(x)


Some background: we are running a custom tool my colleague developed in Python. It extracts data from a SQL database and performs several geoprocessing steps on a single FGDB, including nearest-neighbor interpolation, creating feature classes and raster datasets, mosaicking, and deleting those files. It generally runs well, but it often fails because the workspace is read-only or because a schema lock on a feature prevents feature classes from being created or deleted.

My simple script above is admittedly crude, but the goal was just to simulate that process and see whether the FGDB would lock up again, and it did. Can anyone shed more light on why workspaces behave like this?
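Since the failures are intermittent, one common workaround is to retry the failing geoprocessing call after a short pause, on the theory that whatever process holds the lock releases it a moment later. A minimal sketch (the retry helper, attempt count, and delay are my own illustration, not part of the colleague's tool):

```python
import time

def retry(operation, attempts=5, delay=1.0):
    """Call operation(); on an exception, wait and try again.

    Intended for transient failures such as a schema lock that another
    process has not yet released. Re-raises the last error if every
    attempt fails.
    """
    for attempt in range(attempts):
        try:
            return operation()
        except Exception:
            if attempt == attempts - 1:
                raise
            time.sleep(delay)

# Hypothetical usage around a delete that sometimes hits a schema lock:
# retry(lambda: arcpy.Delete_management(lyr), attempts=5, delay=2.0)
```

This does not explain why the lock appears, but in practice it lets long-running batch jobs survive the occasional transient lock.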