arcpy & GDB .lock files: Best Practice for executing arcpy tools while minimizing .lock file persistence

12-02-2014 04:28 PM
by PF1 (Occasional Contributor II)

FYI for the community. 

The Problem:

So I recently ran into a problem where I wrote a nice little Python script to export some feature classes to a file geodatabase (FGDB) and then zip the FGDB for delivery.  Every time I tried to zip the FGDB in the same Python process as my 'feature class to feature class' conversion, the zip routine would fail with a message similar to:

Traceback (most recent call last):
  File "<OBSCURE_PYTHON_FILE>.py", line <#>, in <module>
    ...
    ...
  File "C:\Python27\ArcGIS10.2\Lib\zipfile.py", line 1149, in write
    with open(filename, "rb") as fp:
IOError: [Errno 13] Permission denied: u'<OBSCURE_PATH>\\<FGDB_NAME>.gdb\\_gdb.<HOSTNAME>.23104.8124.sr.lock'

Indicating that something within my process had an 'sr' lock (schema lock??).  By placing some time.sleep() statements... I tracked it back to this command:

res = arcpy.FeatureClassToFeatureClass_conversion(fc, destination_fgdb, fc)

It appears that the arcpy.FeatureClassToFeatureClass_conversion function was placing a .sr.lock file in the FGDB, and that the lock was not released until the script finished executing. This is a problem since I'm trying to zip up the FGDB right after placing some data in it. 
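You can see this for yourself by listing the .lock files in the FGDB right after the conversion. A minimal sketch (the paths are placeholders for your own data):

import glob
import os

import arcpy

fc = r"C:\data\source.gdb\parcels"          # placeholder input feature class
destination_fgdb = r"C:\data\delivery.gdb"  # placeholder output FGDB

res = arcpy.FeatureClassToFeatureClass_conversion(fc, destination_fgdb, "parcels")

# Any .sr.lock held by this process shows up here.
print(glob.glob(os.path.join(destination_fgdb, "*.lock")))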

The FIX:

Based on this thread... Lucas Danzinger came to the conclusion that:

The reason it worked when I had it in a function is because the variables were automatically deleted once the code in the function completed successfully.

so I took the arcpy.FeatureClassToFeatureClass_conversion() call and placed it in a function called 'tmp()'.  As soon as I did that, the .sr.lock files disappeared before my time.sleep() executed, and the zip process finished successfully. 
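Roughly like this (a sketch, reusing the variables from above):

def tmp():
    # res is local to tmp(); when the function returns, the reference is
    # dropped and arcpy releases the .sr.lock.
    res = arcpy.FeatureClassToFeatureClass_conversion(fc, destination_fgdb, fc)

tmp()
# By this point the .sr.lock is gone and the FGDB can be zipped.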

An alternative (and my preferred method) is to delete the 'res' object (an arcpy.Result) once you are done with it.  That also effectively removes the .sr.lock file. 

res = arcpy.FeatureClassToFeatureClass_conversion(fc, destination_fgdb, fc)
if res.status != 4:  # 4 = Succeeded
    pass  # do something (handle the failure)
del res  # releases the Result and its .sr.lock

So bottom line...

Get in the habit of executing arcpy operations in a function (the .lock will be bound to the scope of the function* and upon completion of the function the .lock file should be cleanly removed) OR delete the result object returned by the function call. 

*If the result is written to a global variable, then the scope would be larger than the function and would most likely still retain a .lock file. 
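Putting it together, here is a minimal sketch of the whole export-then-zip workflow (all paths and feature class names are hypothetical):

import os
import zipfile

import arcpy

arcpy.env.workspace = r"C:\data\source.gdb"  # hypothetical source workspace
destination_fgdb = r"C:\data\delivery.gdb"   # hypothetical output FGDB

def export(fc):
    # The Result object (and its .sr.lock) is bound to this function's scope.
    res = arcpy.FeatureClassToFeatureClass_conversion(fc, destination_fgdb, fc)
    if res.status != 4:  # 4 = Succeeded
        raise RuntimeError(res.getMessages())

for fc in ["parcels", "roads"]:
    export(fc)

# No locks remain, so zipping the FGDB folder now succeeds.
gdb_name = os.path.basename(destination_fgdb)
with zipfile.ZipFile(destination_fgdb + ".zip", "w", zipfile.ZIP_DEFLATED) as zf:
    for name in os.listdir(destination_fgdb):
        zf.write(os.path.join(destination_fgdb, name), os.path.join(gdb_name, name))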

Hope this article helps others with these types of issues. 

2 Replies
saraswathiemani
New Contributor

Thank you very much for the post; it saved a lot of time and frustration. Very well written post.

davedoesgis
Occasional Contributor III

I just upgraded to 2.7.1. I think my previous version was 2.5, but I'm not totally sure. I was not having any lock issues before, but now I get a lock file just from running the CreateFileGDB function (I do other processing, but even just creating a new FGDB comes up locked). Unfortunately, this workaround does not solve my problem. 

I have to exit Python altogether to get ArcPy to remove the lock file -- see the code sample below, run from the command prompt. I may have to run my code in pieces as subprocess calls, which sounds like a Python programmer's nightmare. Thanks, Esri. 

>>> from arcpy.management import CreateFileGDB
>>> def cf():
...     c = CreateFileGDB(r'D:\temp\2021-06', 'test.gdb')
...     del c
...
>>> cf()
>>> from pathlib import Path
>>> p = Path('D:/temp/2021-06/test.gdb')
>>> [str(x) for x in p.glob('*.lock')]
['D:\\temp\\2021-06\\test.gdb\\_gdb.ORG027.4232.16252.sr.lock']
>>> exit()

D:\temp\2021-06>cd test.gdb
D:\temp\2021-06\test.gdb>dir *.lock
Directory of D:\temp\2021-06\test.gdb
File Not Found
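
For what it's worth, the subprocess idea could look something like this (a sketch; worker.py is a hypothetical script holding the arcpy calls). The locks are released when the child process exits, just like the exit() above:

import subprocess
import sys

# worker.py (hypothetical) does the CreateFileGDB / geoprocessing work.
# Its lock files die with the child process.
subprocess.run(
    [sys.executable, "worker.py", r"D:\temp\2021-06", "test.gdb"],
    check=True,
)
# Back in the parent, the FGDB is lock-free and safe to zip or move.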
