Mystery locks on feature classes

06-11-2013 12:49 PM
KerryAlley
Occasional Contributor
I have a short geoprocessing script with an arcpy.Delete_management() line that keeps throwing a lock error:  "ERROR 000464: Cannot get exclusive schema lock.  Either being edited or in use by another application."

Details: (code pasted below too)
I can successfully delete "lrs_buffer" if I skip the lines of code defining the mxd and the layers.  However, I can't think of why there would be a lock on "lrs_buffer" due to creating the map document object, since "lrs_buffer" had never been added to the mxd.  Also, if I'm using a file geodatabase as a workspace and observe through Windows Explorer, no lock files for "lrs_buffer" are created in this workflow.  I am having the same problem if I use a personal geodatabase as my workspace.

import arcpy
arcpy.env.overwrite = True
workspace = r"V:\Projects\Shared\RouteLogSystem\ArcGIS_10_Prototype\Prototype_V10\rtlogptsQAQC\RouteLogPointsQAQC.gdb"
arcpy.mapping.MapDocument(r"V:\Projects\Shared\RouteLogSystem\ArcGIS_10_Prototype\Prototype_V10\rtlogptsQAQC\RouteLogPoints.mxd")

rtlogpts = arcpy.mapping.ListLayers(mxd, "*rtlogpts")[0] #feature class on SDE
lrs = arcpy.mapping.ListLayers(mxd, "*lrs_route_twn")[0] #feature class on SDE
rdsmall_copy = arcpy.mapping.ListLayers(mxd, "rdsmall_arc")[0] #feature class in workspace

bufferFC = "lrs_buffer"
if arcpy.Exists(bufferFC):
    arcpy.Delete_management(bufferFC)
arcpy.Buffer_analysis(lrs.dataSource, bufferFC, "2 Meters", dissolve_option = "ALL")


Any ideas about what is happening?

Thanks!
Kerry
10 Replies
ChrisSnyder
Regular Contributor III
What if you try to delete the FC(s) before you run all the list functions?
RhettZufelt
MVP Frequent Contributor
import arcpy
arcpy.env.overwrite = True
workspace = r"V:\Projects\Shared\RouteLogSystem\ArcGIS_10_Prototype\Prototype_V10\rtlogptsQAQC\RouteLogPointsQAQC.gdb"
mxd = arcpy.mapping.MapDocument(r"V:\Projects\Shared\RouteLogSystem\ArcGIS_10_Prototype\Prototype_V10\rtlogptsQAQC\RouteLogPoints.mxd")

rtlogpts = arcpy.mapping.ListLayers(mxd, "*rtlogpts")[0] #feature class on SDE
lrs = arcpy.mapping.ListLayers(mxd, "*lrs_route_twn")[0] #feature class on SDE
rdsmall_copy = arcpy.mapping.ListLayers(mxd, "rdsmall_arc")[0] #feature class in workspace

bufferFC = r"V:\Projects\Shared\RouteLogSystem\ArcGIS_10_Prototype\Prototype_V10\rtlogptsQAQC\RouteLogPointsQAQC.gdb\lrs_buffer"
if arcpy.Exists(bufferFC):
    arcpy.Delete_management(bufferFC)
#arcpy.Buffer_analysis(lrs.dataSource, bufferFC, "2 Meters", dissolve_option = "ALL")




Not sure what is happening, but I get some errors too.

However, when I modify the script as above, it runs with no errors and deletes the FC.

R_

I should add, all my testing is in the standalone IDE, not within the ArcMap python window.
curtvprice
MVP Esteemed Contributor
That is really weird: if it isn't a full system path, validation seems to convert the string to a feature layer. But I did notice that you did not set the workspace environment, which could be your issue.

arcpy.env.workspace = workspace
mxd = arcpy.mapping.MapDocument("RouteLogPoints.mxd")
...
bufferFC = "lrs_buffer"


If that does not work, you can do this:

bufferFC = os.path.join(workspace, "lrs_buffer")
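Putting that suggestion together as a self-contained sketch (the geodatabase path is the one from the original post; only the path construction is shown, since the delete/buffer steps need arcpy):

```python
import os

# Workspace path from the original post.
workspace = r"V:\Projects\Shared\RouteLogSystem\ArcGIS_10_Prototype\Prototype_V10\rtlogptsQAQC\RouteLogPointsQAQC.gdb"

# Build the fully qualified feature class path so that Exists/Delete
# do not depend on how arcpy.env.workspace happens to be set.
bufferFC = os.path.join(workspace, "lrs_buffer")
print(bufferFC)
```

With the full path in hand, `arcpy.Exists(bufferFC)` and `arcpy.Delete_management(bufferFC)` should behave the same regardless of the environment settings.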
Luke_Pinner
MVP Regular Contributor
I should add, all my testing is in the standalone IDE, not within the ArcMap python window.

Which standalone IDE? 

Note that PythonWin uses the same Python interpreter process for the IDE and for debugging.  It's possible that feature class locks created on the first run of a script will not get cleared (as the Python process hasn't been restarted and the arcpy module is still in memory), causing subsequent runs of the script to fail.  I moved from PythonWin to PyScripter a few years ago, as it supports remote debugging and the remote Python interpreter is reinitialised before each run of the script.
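The lifetime issue described above is ordinary Python behaviour rather than anything arcpy-specific: as long as any name in the interpreter still references an object, that object (and any OS-level lock it holds) stays alive. A minimal pure-Python illustration, using a stand-in class instead of a real arcpy layer so it runs without ArcGIS:

```python
import weakref

class FakeLayer(object):
    """Stand-in for an arcpy object that would hold a schema lock."""
    pass

layer = FakeLayer()
probe = weakref.ref(layer)   # lets us observe whether the object is alive

assert probe() is not None   # still referenced -> alive -> "lock" persists

del layer                    # drop the last reference
assert probe() is None       # collected -> a real lock could now be released
```

In an IDE that reuses one interpreter process, module-level names like `mxd` survive between runs, which is why `del mxd` (or restarting the interpreter) is a common workaround before `Delete_management`.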
KerryAlley
Occasional Contributor
Oops!  Thanks Rhett for pointing out my typos!  However, those typos were only in my forum post, and not in my script.  I did have my workspace environment set, and had the "mxd = " in my line defining the map document.  I'm also using standalone IDEs.

Since Rhett confirmed that the workflow should work, I just gave up, and started implementing work-arounds...

However... Today I can get my scripts to work if I use personal geodatabases instead of file geodatabases!  I have no clue why I had problems with a personal geodatabase yesterday (operator error?? #$%&*!).  So I may have prematurely discarded my original assumption that my issues are related to the Windows updates.  However, I don't have any of the listed "offending" updates installed on my computer (http://support.esri.com/en/knowledgebase/techarticles/detail/41119), and my issues have a slightly different flavor than those described in the Tech Article.

For the record, I observed other related issues while troubleshooting my attempts to create a buffer feature class in a file geodatabase:
- Model Builder errors implied that my workspace was "read only" when it wasn't.
- arcpy.Buffer_analysis(...) caused ERROR 000210: Cannot create output.
- Running the buffer tool from ArcMap doesn't seem to have a problem.

Occasionally I can successfully run the delete, buffer, or other tools using a file geodatabase without getting errors, but I can't for the life of me figure out why, because the successes are so infrequent and seem to be independent of rebooting or creating new workspaces/MXDs.  I have one project that is usually successful in deleting/creating feature classes in a file geodatabase, but that workspace database was created using ArcCatalog 10.0 a year ago.

So, any ideas what's up?
RhettZufelt
MVP Frequent Contributor
Kerry,

That is weird.  Normally, when I am having problems with different data sources, I convert to a FGDB item and the problems go away.

Normally, I can delete a dataset fine if I first delete ANY variable that has been assigned to it, or to a map document that contains it (or has EVER contained it, even if it has since been removed).  For this reason I no longer "modify" existing MXDs; I always start with a fresh ArcMap document and copy/paste datasets when I want a "copy", since that doesn't carry the links/locks to old, once-added datasets.  Sometimes the overwriteOutput setting works, but other times you must delete the dataset first.

It didn't look like it in your example (perhaps you simplified it for posting), but if any of your FCs or tables are loaded from a feature dataset within the FGDB, ALL features within that feature dataset will be locked, not just the items currently loaded.  You will have to delete any and all variable references to any of the datasets within that feature dataset before you will be able to delete.

R_
JohnDye
Occasional Contributor III
I could be wrong, but I think your answer is here.

"When a process ends prematurely or crashes, .lock files may be temporarily left behind in the geodatabase folder. ArcGIS eventually removes these files in future sessions as new locks are taken. In the meantime, such files do not continue to lock data, and as they take up no disk space, removing them provides no benefit."

In my experience, I've found this isn't necessarily true, especially when working with a geodatabase on a network share. Numerous times I have tried to access an FC that someone else processed against, only to find that they were still locking the file even though they didn't even have ArcMap open. (Note: if you look at the geodatabase in Windows Explorer, the name of the lock file reveals the computer the lock came from.) As a result, the first and last thing all of my scripts do is compact the geodatabase, to ensure that no residual locks prevent the script from running and that no locks remain in the GDB upon exit.

"The geoprocessing Compact tool and the copy and paste operations, available in the Catalog tree, also delete unused .lock files. The geoprocessing Compact tool compacts the geodatabase, removing all inactive .lock files in the process. Copying and pasting a file geodatabase removes all inactive .lock files from the source geodatabase before copying data to the new geodatabase."

Hope that helps.
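Since a file geodatabase is just a folder on disk, the residual .lock files mentioned above can be listed with the standard library alone. A small diagnostic sketch (the lock-file naming scheme, including the embedded machine name, is an Esri implementation detail and may vary by version):

```python
import glob
import os

def list_lock_files(gdb_path):
    """Return the .lock files currently present in a file geodatabase folder."""
    return sorted(glob.glob(os.path.join(gdb_path, "*.lock")))

# Point this at a real .gdb folder to see which locks are present;
# the file names typically include the locking machine's name.
for lock in list_lock_files(r"V:\Projects\Shared\RouteLogSystem\ArcGIS_10_Prototype\Prototype_V10\rtlogptsQAQC\RouteLogPointsQAQC.gdb"):
    print(os.path.basename(lock))
```

Running the Compact tool, as quoted above, is still the supported way to clear the inactive ones; this listing is only for diagnosis.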
RhettZufelt
MVP Frequent Contributor
Kerry,

As John pointed out, the compact tool will remove "runaway" lock files for you.

However, this reminded me of another issue that you could be facing.

If someone else has loaded that FC, or any FC from the feature dataset it is in, it will lock the data.  HOWEVER, if they do NOT have write access to the FGDB, you will not see a lock file in there.

I have been locked out several times by my read-only users, and there is no way to tell if/when they have a lock until you get a permission denied lock error.

When this happens, I have to go to my IT guys and have them look at which user is connected through my network share, and then either kick them off or contact the user and have them CLOSE the ArcGIS products (ArcMap and ArcCatalog).  It isn't enough to remove the FC from their map; they need to close the application.

In case this applies,

R_
KerryAlley
Occasional Contributor
Thanks for the insight Rhett!  I was about to try using feature datasets to see if they would help, so you saved me from more frustrated blundering.  It is extremely unlikely that anybody else had opened anything relevant to my workflow. 

While writing this post, an officemate stepped out to lunch, so I jumped on his computer to test everything out on that machine.  I was not able to reproduce the issues unless I used preexisting FGDBs and MXDs (which I presume had been corrupted).  I started troubleshooting to determine what part of my workflow could be corrupting the FGDB and/or MXD, but now I can't reproduce the problems on my computer either.  I still get the errors, however, if I use the FGDBs and MXDs that I created earlier today and yesterday, and a compact doesn't change that.  I'm assuming that those files were permanently corrupted.

Whatever the issue was before, it was evident after each of my 3 complete "fresh starts" (creating a new FGDB and MXD after rebooting) and many partial fresh starts (various combinations thereof).  In addition to getting Error 210 from Python and Model Builder, I also couldn't export a data selection from ArcMap to the FGDB (the ArcMap errors weren't numbered: "Workspace or data source is read only." and "The table was not found. [Export_Output]").

If the problem returns, I'll post more information.  Otherwise I'm keeping my fingers crossed!
Thanks for all the help!