Trouble Removing Schema Lock After arcpy.MakeFeatureLayer (Pro Only)

06-13-2017 06:31 AM
ScottDavis
Frequent Contributor

I'm having trouble removing a schema lock from a feature class in a file geodatabase after running arcpy.MakeFeatureLayer(). The typical arcpy.Delete(layer) isn't working.

I only experience this issue when running the script with ArcGIS Pro's (1.4.1) version of Python. It works fine with ArcGIS Desktop. Any ideas?

Demo

Here's a script that demonstrates the issue:

import arcpy
arcpy.management.CreateFileGDB(r"C:\temp", "test.gdb", "CURRENT")
arcpy.management.CreateFeatureclass(r"C:\temp\test.gdb", "Test", "POLYGON", None, None, "ENABLED", "PROJCS['WGS_1984_Web_Mercator_Auxiliary_Sphere',GEOGCS['GCS_WGS_1984',DATUM['D_WGS_1984',SPHEROID['WGS_1984',6378137.0,298.257223563]],PRIMEM['Greenwich',0.0],UNIT['Degree',0.0174532925199433]],PROJECTION['Mercator_Auxiliary_Sphere'],PARAMETER['False_Easting',0.0],PARAMETER['False_Northing',0.0],PARAMETER['Central_Meridian',0.0],PARAMETER['Standard_Parallel_1',0.0],PARAMETER['Auxiliary_Sphere_Type',0.0],UNIT['Meter',1.0]];-20037700 -30241100 10000;#;#;0.001;#;#;IsHighPrecision", None, 0, 0, 0)
arcpy.env.workspace = "C:\\temp\\test.gdb"
layer = arcpy.MakeFeatureLayer_management('Test', 'NewTest')
arcpy.Delete_management('NewTest', 'Layer')  # delete the layer
del layer  # release the Result object
arcpy.env.workspace = ""
# The schema lock on test.gdb still exists at this point,
# so deleting the geodatabase fails:
arcpy.Delete_management("C:\\temp\\test.gdb")

"""
This throws an exception: 
arcgisscripting.ExecuteError: ERROR 000601: Cannot delete C:\Temp\test.gdb.  
May be locked by another application.
"""
14 Replies
AdrianWelsh
MVP Honored Contributor

Hi Scott,

I looked around and did not see any solutions to this. I wonder if this is a bug that is potentially fixed in ArcGIS Pro 2.0. I am tagging rleclair-esristaff to see if he has any insight.

Also, it might help to phrase your discussion as a question in order to get more responses and more visibility.

Robert_LeClair
Esri Notable Contributor

Hi Scott - first, I have to mention I'm not a Python guru, but I have a suggestion or two to investigate based on my research. First, you may want to run the "Analyze Tools for Pro" GP tool in either ArcMap or Pro. It tests your Python script to see if it will run in Python 3.5.2 (Pro's version of Python) and writes out a text file you can use to troubleshoot. Second, I haven't seen a bug mentioning these error codes, but there was a suggestion to put a time.sleep() call into your code for asynchronous processes. Refresh the folder afterwards. Does the lock remain? Please advise.
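Roughly something like this is what I mean - a sketch only, assuming the demo script from the original post (the 10-second pause is an arbitrary choice):

import time
import arcpy

arcpy.env.workspace = r"C:\temp\test.gdb"
layer = arcpy.MakeFeatureLayer_management("Test", "NewTest")
arcpy.Delete_management("NewTest", "Layer")
del layer
arcpy.env.workspace = ""

# Pause to give any asynchronous cleanup a chance to release the lock
# before attempting to delete the geodatabase.
time.sleep(10)
arcpy.Delete_management(r"C:\temp\test.gdb")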

ShaunWalbridge
Esri Regular Contributor

Hello Adrian and Robert,

We've looked at this and it does look like a bug. It's a regression from 10.x, where this behavior isn't observed. There are a few workarounds for anyone coming to this issue later:

1. Use an in-memory workspace for the temporary results you want to store.

2. Delete the feature class that was created just prior to the layer creation.

3. Run the MakeFeatureLayer step in a subprocess, so that its process terminates before you try to do further work in the GDB (see the sketch below).
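A rough sketch of workaround 3, assuming the layer-dependent work lives in a separate helper script (do_layer_work.py is a hypothetical name, not an existing tool):

import subprocess
import sys
import arcpy

# Run the MakeFeatureLayer work in a child process; the schema lock is
# released when that process exits. do_layer_work.py stands in for
# whatever script contains the MakeFeatureLayer and analysis steps.
subprocess.check_call([sys.executable, r"C:\scripts\do_layer_work.py"])

# Back in the parent process, the file geodatabase can now be deleted.
arcpy.Delete_management(r"C:\temp\test.gdb")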

Cheers, Shaun

AdamThomas3
Emerging Contributor

Scott Davis - I am experiencing the exact issue you describe, but I am using ArcMap 10.5.1 with Python 2.7.

Here's my workflow in a single script (sketched in the code below):

1. Create temporary gdb and feature class (on C:\temp for speed)

2. Create feature layer from feature class (in-memory) to perform some analysis

3. Output results to desired destination

4. Delete temporary gdb (fails)
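Condensed into a sketch (the paths and dataset names below are just placeholders):

import arcpy

# 1. Create temporary gdb and feature class (on C:\temp for speed)
arcpy.CreateFileGDB_management(r"C:\temp", "scratch_work.gdb")
arcpy.CreateFeatureclass_management(r"C:\temp\scratch_work.gdb", "TempFC", "POLYGON")

# 2. Create feature layer from the feature class to perform some analysis
layer = arcpy.MakeFeatureLayer_management(r"C:\temp\scratch_work.gdb\TempFC", "temp_layer")

# 3. Output results to the desired destination
arcpy.CopyFeatures_management("temp_layer", r"C:\output\results.gdb\Results")

# 4. Delete the temporary gdb (this is the step that fails while the lock remains)
arcpy.Delete_management(r"C:\temp\scratch_work.gdb")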

Robert LeClair - The lock will remain for asynchronous processes until the Python process has ended (and raised an error in this case), so adding time.sleep() during the process has no impact. I can delete the gdb in a separate process once the first process has ended.

Shaun Walbridge - 1. The lock gets added to the source data and has no impact on the feature layer, so using in-memory for the feature layer does not help. 2. Not sure how you can make a feature layer from a feature class that has been deleted. 3. This appears to be the only solution for now.

AdamThomas3
Emerging Contributor

This is logged as BUG-000086232 here: https://support.esri.com/en/bugs/nimbus/QlVHLTAwMDA4NjIzMg==, but there is currently no plan to resolve it.

AdamThomas3
Emerging Contributor

One possible solution is to use the scratch workspace instead of creating your own temp workspace. The lock will still be created in the scratch workspace (it only exists while the Python process is running, just like before), but the scratch workspace isn't intended to be deleted, so you won't need to do any cleanup afterwards. arcpy can always create or reuse the same scratch workspace for each new process. In my case the scratch workspace is created in C:\Users\{user}\AppData\Local\Temp\scratch.gdb.

Here is the sample script from above, modified to use the scratch workspace instead:

import arcpy
arcpy.env.workspace = arcpy.env.scratchGDB
arcpy.management.CreateFeatureclass(arcpy.env.workspace, "Test", "POLYGON", None, None, "ENABLED", "PROJCS['WGS_1984_Web_Mercator_Auxiliary_Sphere',GEOGCS['GCS_WGS_1984',DATUM['D_WGS_1984',SPHEROID['WGS_1984',6378137.0,298.257223563]],PRIMEM['Greenwich',0.0],UNIT['Degree',0.0174532925199433]],PROJECTION['Mercator_Auxiliary_Sphere'],PARAMETER['False_Easting',0.0],PARAMETER['False_Northing',0.0],PARAMETER['Central_Meridian',0.0],PARAMETER['Standard_Parallel_1',0.0],PARAMETER['Auxiliary_Sphere_Type',0.0],UNIT['Meter',1.0]];-20037700 -30241100 10000;#;#;0.001;#;#;IsHighPrecision", None, 0, 0, 0)
layer = arcpy.MakeFeatureLayer_management('Test', 'NewTest')

# No cleanup required here.
# The scratch workspace is recreated every time it is used.
# Pretend it never existed!
ShaunWalbridge
Esri Regular Contributor

Scott Davis, Adam Thomas - This issue has been resolved in Pro 2.3. Thanks for bringing it up, and if you see something similar once you have Pro 2.3, let me know.

Cheers,

Shaun

CC Kory Kramer

FranciscoCosta1
Regular Contributor

Shaun Walbridge, Scott Davis - This also seems to be occurring when using Jupyter with ArcGIS Pro open, i.e.:


ArcGIS Pro v2.4.1 is open: arcpy.env.workspace = 'D:\\some_projects\\cogubem\\scratch.gdb'

No locks in here.

The Jupyter notebook is open: arcpy.env.workspace = 'C:\\%users%\\AppData\\Local\\Temp\\scratch.gdb'

ExecuteError: ERROR 000464: Cannot get exclusive schema lock. Either being edited or in use by another application or service. Failed to execute (Delete).

The Jupyter notebook is running in a cloned conda environment that includes arcpy.

I found a way of sorting it, although it is specific to my case here, so I would like to know if this also works in your case.

ShaunWalbridge
Esri Regular Contributor

Francisco,

In the case you describe I think the error is accurate, since there are two separate processes trying to access the workspace simultaneously. You could do something like create a temporary workspace for each process and then write to the shared workspace only at the end to avoid this overlap, or, if your data is small enough, use in-memory workspaces in each process until the final write step (see the sketch below).
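A minimal sketch of that idea, assuming the intermediate data lives in memory and only the final result is written to the shared geodatabase (the intermediate and FinalResult names are just placeholders):

import arcpy

# Each process keeps its intermediate results in its own in-memory workspace,
# so no schema lock is held on the shared file geodatabase while working.
arcpy.env.workspace = "in_memory"
arcpy.CreateFeatureclass_management("in_memory", "intermediate", "POLYGON")

# ... run the analysis against in_memory\intermediate here ...

# Only the final write touches the shared geodatabase, avoiding the overlap
# between the ArcGIS Pro session and the Jupyter notebook process.
arcpy.CopyFeatures_management(r"in_memory\intermediate", r"D:\some_projects\cogubem\scratch.gdb\FinalResult")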

Cheers,

Shaun
