Hi guys, I was doing a test. I created a simple script that makes three blank feature classes in a file geodatabase (FGDB) and then deletes them right after. I made it loop many times against the same FGDB, and at some point the schema lock or "workspace is read only" error appeared. Here is the simple script I was running:
import arcpy

filenames = ['data1', 'data2', 'data3']

def createFiles():
    # Create three empty point feature classes in the test FGDB
    for filename in filenames:
        print 'Creating ' + filename
        arcpy.CreateFeatureclass_management(r'C:\TEMP\Dump\TEST.gdb', filename, 'POINT', '', '', '', '')
    del filename

def deleteFiles():
    # Delete every feature class found in the same FGDB
    arcpy.env.workspace = r'C:\TEMP\Dump\TEST.gdb'
    lyrs = arcpy.ListFeatureClasses()
    for lyr in lyrs:
        print 'Deleting ' + lyr
        arcpy.Delete_management(lyr, '')
    del lyr, lyrs

def runTimes(runNum):
    createFiles()
    deleteFiles()

x = 0
while x != 100:
    print 'Run ' + str(x)
    runTimes(x)
    x += 1
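As a side note, while testing I also wondered whether checking for locks up front would help narrow down where it happens. Below is a minimal sketch of a delete loop that asks arcpy.TestSchemaLock whether an exclusive lock can be acquired before attempting the delete. The function name deleteFilesChecked and the skip behaviour are just for illustration; the workspace path is the one from my test script.

import arcpy

arcpy.env.workspace = r'C:\TEMP\Dump\TEST.gdb'

def deleteFilesChecked():
    # Illustration only: skip any feature class we cannot get an
    # exclusive schema lock on instead of failing the whole run
    for lyr in arcpy.ListFeatureClasses():
        if arcpy.TestSchemaLock(lyr):
            print 'Deleting ' + lyr
            arcpy.Delete_management(lyr)
        else:
            print 'Schema lock still held on ' + lyr + ', skipping'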
We are actually running a custom tool my colleague developed in Python that extracts data from a SQL database and does some geoprocessing on it, such as nearest neighbor interpolation, creating feature classes and raster datasets, mosaicking, and deleting those files, all within a single FGDB. It generally runs well, but it often fails because the workspace is read only or because a schema lock on a feature class prevents feature classes from being created or deleted. So I wrote the script above just to simulate that process. It does look a bit crude, but my goal was simply to see whether the FGDB would lock up again, and it did. Maybe you could shed some more light on why workspaces behave like this.
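In case it helps frame an answer: would a retry pattern like the sketch below be a reasonable workaround, or would it just mask the underlying problem? It clears the workspace cache with arcpy.ClearWorkspaceCache_management between attempts and retries the delete a few times. The function deleteWithRetry and the maxRetries/waitSeconds parameters are made up for the example, not something from our actual tool.

import time
import arcpy

def deleteWithRetry(dataset, maxRetries=5, waitSeconds=2):
    # Illustration only: retry the delete a few times, clearing the
    # workspace cache between attempts in case a stale connection is
    # holding the schema lock
    for attempt in range(maxRetries):
        try:
            arcpy.Delete_management(dataset)
            return True
        except arcpy.ExecuteError:
            print 'Delete failed (attempt ' + str(attempt + 1) + '): ' + arcpy.GetMessages(2)
            arcpy.ClearWorkspaceCache_management()
            time.sleep(waitSeconds)
    return False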