Cannot delete files in FGDB - Need Arcpy solution

MKF62 (Occasional Contributor III) - 01-03-2019 06:17 AM

I have created a script where, at the end, I transfer records from an FGDB to an SDE database. I use the editor and believe I stop the operations correctly. At the end of this process, there are no longer any lock files in my FGDB. When I go to delete the entire FGDB, it gets hung up on a .gdbtable file and the deletion process stops (and I assume if that were fixed, it would get hung up on .gdbindexes, .gdbtablx, or .spx).

I have tried compacting the database and this does not work. I have also tested whether I can obtain a schema lock on one of the feature classes inside the FGDB, and it says I can, which I think indicates that exclusive locks are not present. When I look in File Explorer, there are no locks present. I really need an arcpy solution to this problem so the FGDB deletes successfully.
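
(For context, the schema lock check I mean is roughly the following sketch; the feature class path is just a placeholder.)

import arcpy

# Placeholder path to one of the feature classes inside the scratch FGDB
fc = r"C:\Temp\scratchWorkspace\scratch.gdb\Patches"

# TestSchemaLock returns True if an exclusive schema lock can be acquired,
# i.e. nothing else is holding a lock on the dataset at that moment
if arcpy.TestSchemaLock(fc):
    arcpy.AddMessage("Able to acquire a schema lock on {0}".format(fc))
else:
    arcpy.AddMessage("Unable to acquire a schema lock on {0}".format(fc))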

This is my code, starting at the editor process. Before this piece of code, I use search cursors on the FGDB with the with/as notation (so cursor locks should be released automatically).

#Instantiate an editing object on the HbMonitoring database.
editor = arcpy.da.Editor(hb_db_con)

try:
    arcpy.AddMessage("Updating Observers table with new observers...")
    editor.startEditing(False, False)
    editor.startOperation()
    #Create insert cursor to insert new observers into the Observers database table
    with arcpy.da.InsertCursor(obsvTable, obsvFields) as obsvCursor:
        for item in newObservers:
            obsvCursor.insertRow(item)
    editor.stopOperation()
    editor.stopEditing(True)
except:
    #do stuff
    pass

try:
    arcpy.AddMessage("Copying data to Patches table...")
    editor.startEditing(False, False)
    editor.startOperation()
    #Create insert cursor to insert new rows into the Patches table
    with arcpy.da.InsertCursor(PatchesFC, icFields) as iCursor:
        arcpy.AddMessage("Inserting patch row...")
        for item in patchInsert:
            iCursor.insertRow(item)
    editor.stopOperation()
    editor.stopEditing(True)
except:
    #do stuff
    pass

try:
    arcpy.AddMessage("Copying data to Protective Cover table...")
    editor.startEditing(False, False)
    editor.startOperation()
    #Create insert cursor to insert new rows into the ProtectiveCover table
    with arcpy.da.InsertCursor(proCovFC, iPCFields) as pcCursor:
        for item in pcInsert:
            pcCursor.insertRow(item)
    editor.stopOperation()
    editor.stopEditing(True)
except:
    #do stuff
    pass

#Delete the scratch folder and scratch GDB
#If working with database from ArcMap interface
if "C:\Users" in scratchFolder:
    arcpy.AddMessage("Deleting scratch folder...")
    shutil.rmtree(scratchFolder)
    arcpy.Compact_management(scratchGDB)
    arcpy.AddMessage("Deleting scratch GDB...")
    shutil.rmtree(scratchGDB)

These are the files that are left after I attempt to delete the FGDB (ignore the "VLC media file" - it's just Windows assuming it knows what program to open .spx files in)

6 Replies
JoshuaBixby (MVP Esteemed Contributor)

With your cursors prior to this code, are you deleting the cursor objects after you are done with them?  Using a Python with statement will address locks, or most of them, but the cursor object still exists after the with statement executes and there could be file handles still open.

Esri's initial documentation of using with statements was very misleading; I submitted feedback and logged documentation defects to point out that using with doesn't guarantee all locks and file handles are released. Esri finally updated the documentation with 10.5, SearchCursor—Help | ArcGIS Desktop:

Search cursors can be iterated using a for loop. Search cursors also support with statements to reset iteration and aid in removal of locks. However, using a del statement to delete the object or wrapping the cursor in a function to have the cursor object go out of scope should be considered to guard against all locking cases.
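
A minimal sketch of what that looks like (the dataset path and fields here are just placeholders):

import arcpy

fc = r"C:\Temp\scratch.gdb\Patches"  # placeholder dataset path

# Option 1: explicitly delete the cursor object once the with block is done
with arcpy.da.SearchCursor(fc, ["OID@"]) as cursor:
    rows = [row for row in cursor]
del cursor  # releases any file handles the object may still hold

# Option 2: wrap the cursor in a function so the object goes out of scope on return
def read_rows(dataset, fields):
    with arcpy.da.SearchCursor(dataset, fields) as cur:
        return [row for row in cur]

rows = read_rows(fc, ["OID@"])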

MKF62 (Occasional Contributor III)

Thanks, I will give that a shot because I am not using "del" statements after cursors right now. I also discovered through debugging that some locks are actually persisting but disappear once the script completes with an error (which makes perfect sense now that I think about it...).

MKF62 (Occasional Contributor III)

Alright - so close to working. Now I am left with two files right before the script errors out; it gets hung up on deleting the "timestamps" file, saying something else is accessing it. Right before the script errors out I also have a _gdb lock file; it disappears once I close the console window where the script is running (which effectively stops the script).

JoshuaBixby (MVP Esteemed Contributor)
#Delete the scratch folder and scratch GDB
#If working with database from ArcMap interface

What if you run the code outside of ArcMap entirely?  I ask because I have seen ArcMap/ArcCatalog keep handles on the timestamps files in the GDBs, so regardless of what you are doing with ArcPy those files can't be deleted until ArcMap/ArcCatalog releases them.

JamesMacKay3 (Occasional Contributor) - Accepted Solution

Try deleting your editor object as well (using del), and rather than using rmtree to delete your FGDB, try the arcpy.Delete_management geoprocessing tool.
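
Something along these lines (a sketch, assuming the editor and scratchGDB variables from your code):

import arcpy

# Drop the Editor object so it releases its handle on the workspace
del editor

# Use the geoprocessing Delete tool instead of shutil.rmtree;
# it handles the file geodatabase as a unit, including its internal files
if arcpy.Exists(scratchGDB):
    arcpy.Delete_management(scratchGDB)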

JoshuaBixby (MVP Esteemed Contributor)

Great point, especially about using ArcGIS's own tool to remove the GDB.
