Cannot delete temp gdb due to lock from executing program

06-03-2022 02:30 PM
SFM_TravisBott
Occasional Contributor III

I am having trouble with the last phase of a script for a script tool: I cannot figure out how to delete my temporary gdb to close out the script. Having the program that's executing the script open retains a lock on the gdb; for instance, when I close Jupyter (or VS Code, or IDLE) I can see the lock file in the gdb disappear. How can I construct my code so this isn't an issue when it's made into a script tool?

There have been a few other posts on this topic, and none of those solutions seems quite right. The general advice has been to add arcpy.env.overwriteOutput = True, and that has not been effective.

The code that creates the gdb:

#Create temporary Workspace
arcpy.AddMessage("\nCreating temporary workspace")
tempName = "incorp" + outVersion + "_TEMP"
tempGDB = os.path.join(outFolder, tempName + ".gdb")
arcpy.management.CreateFileGDB(outFolder, tempName)
arcpy.env.workspace = tempGDB
arcpy.env.overwriteOutput = True

The script then goes on and does several operations within that temp gdb, clipping, erasing, querying, adding fields, etc. Then when I go to delete it...

#Delete intermediate data
arcpy.AddMessage("\nDeleting intermediate data")
todel = (tempGDB, union, lastIncorp_xml)
for dataset in todel:
    if arcpy.Exists(dataset):
        arcpy.management.Delete(dataset)

I get ERROR 000601 - Cannot delete [path].gdb. May be locked by another application. Failed to execute (Delete).

Any thoughts on what can be done so the tool can clean up after itself?

12 Replies
Brian_Wilson
Occasional Contributor III

If you have some data structure hanging around in memory, like a cursor on the data, then you need to delete it to remove the lock; for example, if it's called "rows", you say "del rows". Here is a sample:

# Legacy-style cursor; it holds a schema lock until its references are deleted
cursor = arcpy.SearchCursor("roads", '"TYPE" <> 4')
for row in cursor:
    print("Name: {0},  CFCC code: {1}".format(row.NAME, row.CFCC))

# Release the lock by dropping the cursor and row references
del cursor, row

BlakeTerhune
MVP Regular Contributor

Agreed. Or better yet, use a da cursor in a with statement so the lock gets released even if there's an error before the del can run (the legacy cursors don't support with):

with arcpy.da.SearchCursor("roads", ["NAME", "CFCC"], '"TYPE" <> 4') as cursor:
    for row in cursor:
        print("Name: {0},  CFCC code: {1}".format(row[0], row[1]))

BlakeTerhune
MVP Regular Contributor

Although I've not tested it, I heard Compact might clear out erroneous locks.
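
Untested here as well, but a minimal sketch of that idea (assuming tempGDB from the original post still points at the file geodatabase):

import arcpy

# Compact rewrites the gdb's internal storage; the idea above is that it
# may also clear stale .lock files before the Delete call.
arcpy.management.Compact(tempGDB)
arcpy.management.Delete(tempGDB)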

by Anonymous User (Accepted Solution)
Not applicable

If it's temp data, consider using the memory workspace so it's not writing to disk. There is the arcpy environment's scratch geodatabase (arcpy.env.scratchGDB) you can use as well, but you'll still have to manually delete the items.

If your two other items (union, lastIncorp_xml) in the todel tuple are datasets in the gdb you are trying to delete, it could be creating a lock when the tuple is created, preventing deletion of the database. If they are just references (i.e., assigned to geoprocessing output), you can 'del' them and then delete the tempGDB, or maybe try reordering the items to delete.
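
A minimal sketch of the del-then-delete idea (fc_a and fc_b are hypothetical inputs standing in for the poster's data):

import os
import arcpy

# union holds a geoprocessing Result whose output lives inside tempGDB
union = arcpy.analysis.Union([fc_a, fc_b], os.path.join(tempGDB, "union"))

# ... downstream processing ...

# Drop the Python reference first so it can't hold a lock,
# then delete the geodatabase itself.
del union
arcpy.management.Delete(tempGDB)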

SFM_TravisBott
Occasional Contributor III

@Anonymous User I am not an expert pythonista, so I am not sure if my workflows are best practices, but I wanted to write to disk so that I can verify intermediate data as I work downstream. This seems like a reasonable thing to keep in the workflow so if there are problems with the script's execution later it can maybe be troubleshooted (troubleshot?). 

union is in the tempGDB, lastIncorp_xml is not. When I have the contents of the gdb open in File Explorer I notice the lock file disappear when the program executing python is closed, so there's something about accessing it in this way that creates the lock file (which supports your argument for doing it in memory, but I would like it on disk if I can). I have tried re-ordering items in the deletion but to no avail. 

I will try to put them in as references and see if 'del' works.

Brian_Wilson
Occasional Contributor III

New theory. You are using several calls into geoprocessing tools (clip, query, etc.), and you have no control over the fact that one or more of them, called from your Python process, is holding onto the file lock. That's a tough one. Since the Esri code is running in the same process as your calling code, their code won't clean up until your main program ends.

If this is true then you'd need to call the errant tool in a subprocess so that it would exit cleanly. That would be a pain. You'd need to figure out which tool(s) needed to be isolated. You'd need to learn about executing subprocesses. Not a bad thing to know really but still a pain. There is a lot more overhead while the subprocess loads a copy of python, loads the packages blah blah blah and then loads the tool and runs it.
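
A rough sketch of that isolation pattern, assuming a hypothetical helper script run_clip.py that imports arcpy, runs one Clip on the three paths passed to it, and exits:

import subprocess
import sys

# Hypothetical paths; any locks held by the helper are released
# when the child process exits.
helper = r"C:\scripts\run_clip.py"
in_features = r"C:\data\roads.shp"
clip_features = r"C:\data\boundary.shp"
out_fc = r"C:\data\work.gdb\roads_clip"

result = subprocess.run(
    [sys.executable, helper, in_features, clip_features, out_fc],
    capture_output=True,
    text=True,
)
if result.returncode != 0:
    raise RuntimeError("Clip subprocess failed:\n" + result.stderr)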

I think you should try JeffK's idea first, using in_memory -- it's also much faster. Leave the existing on-disk code in there, commented out, so you can easily put it back if you need to debug. Ha -- or put in a Copy Features call to write the intermediate data to disk once a processing step is done. That would bypass the lock file issue, let you chain several steps in memory for more speed, and let you copy to disk only when you need to examine intermediate results.
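
A minimal sketch of that hybrid pattern (in_fc, boundary_fc, and qc_gdb are hypothetical):

import os
import arcpy

in_fc = r"C:\data\roads.shp"           # hypothetical input
boundary_fc = r"C:\data\boundary.shp"  # hypothetical clip features
qc_gdb = r"C:\data\qc.gdb"             # hypothetical on-disk gdb for inspection

# Heavy lifting happens in the memory workspace...
clipped = arcpy.analysis.Clip(in_fc, boundary_fc, r"memory\roads_clipped")

# ...and intermediate data is copied to disk only for QC.
# Comment this line out once the script runs cleanly.
arcpy.management.CopyFeatures(clipped, os.path.join(qc_gdb, "roads_clipped_qc"))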

I think Esri probably sidesteps this issue with Model Builder because each little colored box invokes a new process? I stopped using Model Builder a few years ago so I never tell people to use it anymore.

SFM_TravisBott
Occasional Contributor III

I am still not fully clear on what was holding it open, but the "memory" option worked fine.

#Set memory workspace
arcpy.AddMessage("\nCreating temporary workspace")
memory = r"memory"
arcpy.env.workspace = memory
#Clip by boundary layer
arcpy.AddMessage("\nClipping to state boundary")
boeClipped = os.path.join(memory, "boeClipped")
arcpy.analysis.Clip(boeDataSet, boundary, boeClipped)

For those out there that are novices like me: you still have to delete the layers created in memory, otherwise you can't run the tool again; it'll throw errors saying the items already exist. And apparently you can't set overwriteOutput to True when using memory.
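
For example, a minimal cleanup pass at the end of the script (reusing boeClipped from the snippet above; any other in-memory outputs would be listed the same way):

import arcpy

# Delete in-memory outputs so the tool can be rerun in the same session
for item in (boeClipped,):
    if arcpy.Exists(item):
        arcpy.management.Delete(item)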

by Anonymous User
Not applicable

That is good that you found a solution, nice work! I do find it odd that it requires you to delete the items, because the memory workspace only exists for as long as the script is running; once the script completes, the memory workspace should be deleted and the RAM it used released. You may try the legacy 'in_memory' workspace too. There are some differences between the two, and one may work better than the other for certain tasks.
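
For reference, a quick sketch of the two prefixes (in_fc and clip_fc are hypothetical inputs):

import arcpy

# in_fc and clip_fc are hypothetical feature class paths
arcpy.analysis.Clip(in_fc, clip_fc, r"memory\roadsClip")     # current memory workspace
arcpy.analysis.Clip(in_fc, clip_fc, r"in_memory\roadsClip")  # legacy in_memory workspace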

by Anonymous User
Not applicable

Your Windows Explorer could be stopping the deletion because the gdb is 'open'/in use by another process. Sounds flaky, but it happens: you are trying to delete a 'folder' while it's being viewed in Windows Explorer, and Windows throws a fit.

Yeah, do your work in memory and then use CopyFeatures() wherever/whenever you want to export a QC dataset, as Brian laid out. I do it all the time during script building, and once it is running correctly, I comment out the export statements.
