
arcpy & GDB .lock files: Best practice for executing arcpy tools while minimizing .lock file persistence

7286
5
12-02-2014 04:28 PM
PF1
Frequent Contributor

FYI for the community. 

The Problem:

So I recently ran into a problem. I wrote a nice little Python script to export some feature classes to a FGDB and then zip the FGDB for delivery.  Every time I tried to zip the FGDB in the same Python process as my 'feature class to feature class conversion' execution, the zip routine would fail with a message similar to:

Traceback (most recent call last):

  File "<OBSCURE_PYTHON_FILE>.py", line <#>, in <module>

   ...

    ...

  File "C:\Python27\ArcGIS10.2\Lib\zipfile.py", line 1149, in write

    with open(filename, "rb") as fp:

IOError: [Errno 13] Permission denied: u'<OBSCURE_PATH>\\<FGDB_NAME>.gdb\\_gdb.<HOSTNAME>.23104.8124.sr.lock'

This indicates that something within my process held an 'sr' lock (a schema lock?).  By placing some time.sleep() statements, I tracked it back to the command:

res = arcpy.FeatureClassToFeatureClass_conversion(fc, destination_fgdb, fc)

It appears that the arcpy.FeatureClassToFeatureClass_conversion function was placing an .sr.lock file in the FGDB, and that the lock was not released until the code completed execution. This is a problem, since I'm trying to zip up the FGDB after placing some data in it. 

The FIX:

Based on this thread... Lucas Danzinger came to the conclusion that:

The reason it worked when I had it in a function is because the variables were automatically deleted once the code in the function completed successfully.

so I took the arcpy.FeatureClassToFeatureClass_conversion() call and placed it in a function called 'tmp()'.  As soon as I did that, the .sr.lock files disappeared before my time.sleep() executed, and the zip process finished successfully. 
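The mechanism here is plain CPython behavior rather than anything arcpy-specific: a function's locals are dereferenced when it returns, and an object whose reference count hits zero is finalized immediately. A minimal stand-in, with no arcpy required (FakeResult is a hypothetical class used only to illustrate the finalization timing of an arcpy.Result):

```python
class FakeResult:
    """Stand-in for an arcpy.Result that holds a 'lock' while alive."""
    released = False

    def __del__(self):
        # In arcpy, finalizing the Result is what lets the .sr.lock go away.
        FakeResult.released = True


def tmp():
    res = FakeResult()  # local: dereferenced as soon as tmp() returns


tmp()
print(FakeResult.released)  # → True: the 'lock' was released at function exit
```

Note this immediate cleanup is a CPython refcounting detail; if the result were also stored in a global or a longer-lived container, the finalizer (and so the lock release) would be deferred.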

An alternative (and my preferred method) is to delete the 'res' object (an arcpy.Result) once you are done with it.  That also effectively removed the .sr.lock file. 

res = arcpy.FeatureClassToFeatureClass_conversion(fc, destination_fgdb, fc)
if res.status != 4:  # 4 = job succeeded
    # handle the failure
    pass
del res  # dropping the Result releases the .sr.lock

So bottom line...

Get in the habit of executing arcpy operations in a function (the .lock will be bound to the scope of the function* and upon completion of the function the .lock file should be cleanly removed) OR delete the result object returned by the function call. 

*If the result is written to a global variable, then the scope would be larger than the function and would most likely still retain a .lock file. 
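With the lock released, the delivery step itself is plain standard library. Below is a sketch of a zip helper under the assumption that the arcpy export has already run and its Result has gone out of scope; zip_fgdb() is a hypothetical name, and it defensively skips any straggling .lock files rather than failing on them:

```python
import os
import zipfile


def zip_fgdb(gdb_path, zip_path):
    """Zip a file geodatabase folder, skipping any residual .lock files."""
    root_parent = os.path.dirname(gdb_path)
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for dirpath, _dirnames, filenames in os.walk(gdb_path):
            for name in filenames:
                if name.endswith(".lock"):
                    continue  # never ship (or trip over) lock files
                full = os.path.join(dirpath, name)
                # Store paths relative to the .gdb's parent folder so the
                # archive unzips to <name>.gdb/...
                zf.write(full, os.path.relpath(full, root_parent))
```

Call this only after the exporting function has returned (or after `del res`); skipping `*.lock` is a belt-and-braces guard, not a substitute for releasing the Result.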

Hope this article helps others with this type of issue. 

5 Replies
saraswathiemani
New Contributor

Thank you very much for the post; it saved a lot of time and frustration. Very well-written post.

davedoesgis
Frequent Contributor

I just upgraded to 2.7.1. I think my previous version was 2.5, but I'm not totally sure. I was not having any lock issues before, but now I get a lock file just from running the CreateFileGDB function (I do other processing, but even just creating a new FGDB comes up locked). This workaround does not solve my problem, unfortunately. 

I have to exit Python altogether to get ArcPy to remove the lock file -- see the code sample below, run from the command prompt. I may have to run my code in pieces as subprocess calls, which sounds like a Python programmer's nightmare. Thanks, Esri. 

>>> from arcpy.management import CreateFileGDB
>>> def cf():
...     c = CreateFileGDB(r'D:\temp\2021-06', 'test.gdb')
...     del c
...
>>> cf()
>>> from pathlib import Path
>>> p = Path('D:/temp/2021-06/test.gdb')
>>> [str(x) for x in p.glob('*.lock')]
['D:\\temp\\2021-06\\test.gdb\\_gdb.ORG027.4232.16252.sr.lock']
>>> exit()

D:\temp\2021-06>cd test.gdb
D:\temp\2021-06\test.gdb>dir *.lock
Directory of D:\temp\2021-06\test.gdb
File Not Found


Adrian
Occasional Contributor

Same here. We just upgraded from ArcGIS Pro 3.0 to 3.3, and the locking nightmare got worse. We found another function that leaves lock files behind in this version: arcpy.da.Walk

If you want to process a file geodatabase after an arcpy function (e.g. delete, zip, or copy it), you are stuck. The only workaround is a wrapper that runs every arcpy function in a separate subprocess. When will Esri get the lock problem under control? The large number of threads on this shows that it is a serious problem.
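That wrapper pattern can be sketched with nothing but the standard library: each snippet runs in a throwaway child interpreter, so every file handle and lock the child held is gone by the time the call returns. The wrapper itself is generic; the commented-out arcpy call at the bottom is the assumption about how it would be used:

```python
import subprocess
import sys


def run_isolated(snippet):
    """Run a code snippet in a fresh Python process; its locks die with it."""
    proc = subprocess.run(
        [sys.executable, "-c", snippet],
        capture_output=True,
        text=True,
    )
    if proc.returncode != 0:
        raise RuntimeError("child process failed:\n" + proc.stderr)
    return proc.stdout


# Hypothetical usage: the child imports arcpy, performs one geoprocessing
# call, then exits, and the OS releases its .lock files on process exit.
# run_isolated(
#     "import arcpy; "
#     "arcpy.management.CreateFileGDB(r'D:\\temp\\2021-06', 'test.gdb')"
# )
```

The cost is a fresh arcpy import per call, which is slow, so it is worth batching related geoprocessing steps into a single snippet per child process.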

SRK
Emerging Contributor

Just use

arcpy.management.ClearWorkspaceCache()

when you are done with the arcpy stuff.

Adrian
Occasional Contributor

Thank you for your answer.

We recently found this workaround with ClearWorkspaceCache() as well, and we use it quite often. It seems to release locks more reliably since ArcGIS Pro 3.3, and it's good to have a tool to release locks on demand.

However, as far as I know, releasing locks is not documented as functionality of this method. Moreover, it is and remains a workaround for a messy implementation in arcpy. It should not be the responsibility of a library's user to keep the library's objects in a consistent state. It would be great if this problem could be addressed in one of the upcoming releases.