
Cannot delete temp gdb on subsequent iterations because of lock

01-14-2021 07:50 AM
FranklinAlexander
Frequent Contributor

I am having a major problem that is preventing me from running a Python 3.x script. I am iterating over dozens of folders and creating a lot of temporary data on each iteration. I create a temp gdb to store the temp data, then delete it and create a new one on each pass. I have done this before with no issue, but now, after the first iteration, I get a lock error: "Cannot delete C:/WebDev/Projects/...\temp.gdb. May be locked by another application." I want to emphasize that nothing is open; all programs, including ArcPro, were closed before running the script, and the error persists. I know this is a common issue and am hoping someone can give me a workaround. I have also tried simply overwriting the gdb, but that fails as well.
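
Roughly, this is the pattern (the project path and folder names here are placeholders, not my real ones):

import os
import arcpy

workDir = r"C:\WebDev\Projects"  # placeholder for the real project path

for folder in os.listdir(workDir):
    tempDir = os.path.join(workDir, folder)
    tempGDB = os.path.join(tempDir, "temp.gdb")

    # delete last pass's gdb and create a fresh one
    if arcpy.Exists(tempGDB):
        arcpy.management.Delete(tempGDB)  # fails on the 2nd pass: "May be locked by another application"
    arcpy.management.CreateFileGDB(tempDir, "temp.gdb")

    # ... geoprocessing that writes temporary data into tempGDB ...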

I found this statement in the Esri documentation regarding schema locks:

"Scripts may always update the schema of data created by tools within the same script, even if the current workspace is being used by another application."

If I understand this correctly, as long as the data (including the temp gdb) is being created from within the script, I should not be getting a lock error when attempting to delete it.

Can anyone please shed some light on this?

Thanks! 

4 Replies
DanPatterson
MVP Esteemed Contributor

Is in_memory not an option?

Write geoprocessing output to memory—ArcGIS Pro | Documentation
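
For example, something along these lines (the feature class names are just illustrations):

import arcpy

# write intermediate output to the memory workspace instead of a file gdb
arcpy.management.CopyFeatures("parcels", r"memory\temp_parcels")

# nothing is written to disk, so cleanup between iterations is just
arcpy.management.Delete(r"memory\temp_parcels")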


... sort of retired...
FranklinAlexander
Frequent Contributor (Accepted Solution)

Yes, I did try that and it works great once I have finished developing and testing the script:

tempDB = arcpy.env.scratchGDB

While testing, however, I need to be able to see the temp data so I can troubleshoot my errors, and I cannot do that if the temp data is in memory. I did find a solution that seems to be working so far: I create a fresh temp gdb first (outside the for loop) and set arcpy.env.overwriteOutput = True. I don't get any lock errors, because the temp gdb is only deleted and recreated once, before the loop begins; deleting it on the first pass always worked fine. Just a little frustrating that I had been testing this script for weeks without any issue (even with ArcPro open!) and only in the last few days started getting the lock error.
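
Roughly what I ended up with (the paths are placeholders):

import os
import arcpy

arcpy.env.overwriteOutput = True  # each pass silently replaces last pass's outputs

workDir = r"C:\WebDev\Projects"              # placeholder
tempGDB = os.path.join(workDir, "temp.gdb")

# create the temp gdb once, before the loop, so it never has to be deleted again
if not arcpy.Exists(tempGDB):
    arcpy.management.CreateFileGDB(workDir, "temp.gdb")

for folder in os.listdir(workDir):
    # ... write temp outputs into tempGDB; overwriteOutput replaces them in place ...
    pass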

Thanks for the suggestion; it's nice to have that option when setting the script to run on a schedule from the enterprise server. Less that can break!

JoshuaBixby
MVP Esteemed Contributor

With nothing open that accesses that FGDB, make sure there are no orphaned lock files. If a previous run terminated prematurely or incorrectly, the script may be seeing an orphaned lock that prevents the FGDB from being cleaned up.
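
Something like this will list any leftover locks (the path is a placeholder); file geodatabase locks show up as *.lock files inside the .gdb folder:

import glob
import os

gdb = r"C:\WebDev\Projects\temp.gdb"  # placeholder path

# any files printed here are locks still held (or orphaned) on the FGDB
for lock in glob.glob(os.path.join(gdb, "*.lock")):
    print(lock)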

FranklinAlexander
Frequent Contributor

Thanks for the suggestion. I am not sure I am able to check on that, since I don't have admin privileges. However, I don't think that is the issue, because the script doesn't throw an error on the first iteration. If there is an orphaned lock, I think it is being caused by some process in the script, not by ArcPro or some other program failing to release it. At this point I have found a workaround, so it's not an issue anymore.