Deleting a File Geodatabase with locks

06-13-2012 08:04 PM
New Contributor III
Hi,

I've come across many posts about lock files for both personal and file geodatabases, but my situation is slightly different: I need to delete a file geodatabase so I can replace it with a new one.  I run a maintenance script at 3am on Sunday mornings, so no users should be connected then (unless they're super-dedicated to their work), but you know how people always leave their computers on overnight.  As these databases are really just there to be read, not edited, I can safely delete them.  The problem is that the locks prevent me from doing so, and the script then has to wait until the next week to update the database, assuming nobody has it locked then either.

My question is whether there is a way to simply delete the entire database, regardless of whether it has a lock.  I'm assuming Esri functions won't do this, but is there a way I can add some sort of Python subprocess to force the delete, or maybe a way to kick that user off the license manager?  I'm aware that we can use the lmremove function, but when we move from 9.3.1 to 10.1 at the end of the year, we won't be able to do that anymore, and I also can't access the user's processes remotely to kill the active license.

I'm really not trying to edit or save the database in any way.  I just want to strong-arm the delete.

Cheers

Dean
11 Replies
MVP Esteemed Contributor
Could you delete it at a system level?  I'm not sure where the lock file is stored, but perhaps by removing the fgdb directory itself you'd get the whole thing in one go.
Esri Esteemed Contributor
Yes, you can recursively delete the .gdb directory at the system level, but I worry about the applications which had created locks... That won't do them any good.

- V
New Contributor III
Thanks for the replies.

That's what I'm currently trying.  I'm using shutil.rmtree to delete the entire folder.  When there are no .lock files in the folder, it works fine, but if there are, it starts deleting some of the files and then falls over on the files those locks are holding on to, as well as on the lock files themselves.  This leaves the whole GDB useless and a headache to clean up.  Because ArcGIS can't access the GDB anymore to refresh and successfully release the locks, it's an absolute mission to get rid of them.  Last time I had to wait for a server restart, which meant that users had no access to the database.

As a fallback I've thought about checking for lock files first, using the Python sleep function to delay processing, and rechecking until the lock files are gone, then deleting the FGDB before starting the export from SDE to it.  The problem is that if someone does leave their machine on overnight, the check will run into the next day, and as the export takes a few hours to create and copy over the network, users might not have any database for a few hours the following day.  I could of course abort if the checks go past a certain time, then just have the task run daily and check when it was last run to make sure it's not updating too frequently.

Anyway, having the ability to just delete the database will save me a lot of time and effort.  I'm really not concerned about the applications that have connections to it.  The ArcMap sessions they have connected are read-only and frankly, they have been told not to keep them active overnight.  Everyone's happy with this arrangement, but you do get people just plainly forgetting.
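The check-and-wait fallback described above can be sketched in a few lines. This is a minimal sketch, assuming the script runs with direct filesystem access to the FGDB; the function name, polling interval, and timeout are placeholders:

```python
import glob
import os
import shutil
import time

def delete_fgdb_when_unlocked(gdb_path, poll_seconds=60, give_up_after=4 * 3600):
    """Poll for .lock files inside the FGDB; delete the whole folder once
    they are gone.  Returns True if deleted, False if we hit the deadline."""
    deadline = time.time() + give_up_after
    while time.time() < deadline:
        locks = glob.glob(os.path.join(gdb_path, "*.lock"))
        if not locks:
            # No locks visible right now -- remove the entire .gdb folder.
            shutil.rmtree(gdb_path)
            return True
        time.sleep(poll_seconds)
    # Someone held a lock past the deadline; leave the FGDB intact.
    return False
```

Note there is still a small race between the lock check and the rmtree (a user could reconnect in between), so this reduces, but does not eliminate, the half-deleted-GDB risk described above.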
MVP Esteemed Contributor

...they have been told not to keep them active overnight.  Everyone's happy with this arrangement, but you do get people just plainly forgetting.


A system admin's lament.  I have been the 'benevolent s-a' over the years, but as I get older, I get grumpier; no more phone calls asking users to get off.  I just stop and start the service.  Funny thing is, then my phone starts ringing....
New Contributor III
I am trying to do something similar when running Python scripts that check the integrity of our tools. I'm trying to build the ability to restore test data to its original state after a test passes or fails, but for some of our tools the locks in the database remain even after the geoprocessing has finished. It's only when the Python interpreter process is killed that the locks release, which means I can't restore the data without exiting the script runner. I've been trying to find a way to either run the tool in a separate Python process or somehow remove the locks, but I can't seem to solve the problem. Any ideas?
New Contributor III
It's only when the Python interpreter process is killed that the locks release. That means I can't restore the data without exiting the script runner. I've been trying to find a way to either run the tool in a separate Python process or somehow remove the locks, but I can't seem to solve the problem. Any ideas?


This is just a shot in the dark, but have you tried deleting arcpy, having the script wait a bit and then loading it again and continuing?  You can use time.sleep for the delay.

I haven't tried this myself, so I'm not sure whether deleting arcpy will remove the locks, but it's worth a try.
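For what it's worth, `del arcpy` on its own only removes the local name binding; the module stays cached in sys.modules, so a later `import arcpy` just hands back the same module object. To force a genuinely fresh import you also have to drop it from the cache. A sketch of that idea (whether a re-import actually releases ArcGIS locks is untested, as noted above; the helper name is made up for illustration):

```python
import sys
import time

def drop_and_reimport(module_name, wait_seconds=0):
    """Forget a cached module, optionally wait, then import it fresh.

    Plain `del module` is not enough: Python keeps the module in
    sys.modules, so it must be evicted from there too."""
    sys.modules.pop(module_name, None)  # forget the cached module, if any
    time.sleep(wait_seconds)            # give any lock cleanup a chance
    return __import__(module_name)      # perform a fresh import
```

In the script runner this would be `arcpy = drop_and_reimport("arcpy", wait_seconds=30)` between tests.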
MVP Regular Contributor
If you are still having issues deleting a geodatabase on the file system even though the .lock files no longer exist, it's possible there is still a process lock on one or more of the files within it.  If you're using Windows Server 2008 / Windows 7 or later, you can try one of the following methods:

1. From the Start menu, type FSMGMT.MSC, then multi-select the files you want in the GUI, right-click them and choose "Close".  That method should force-close the files that are technically still open due to a process lock. 

or

2. From a batch file, run the following (example is for a file geodatabase) to close a file named a00000225.gdbtable:

cd C:\this_server\directory\subdirectory
for /f "skip=4 tokens=1,2*" %%a in ('net files') do if %%b == C:\this_server\...\a00000225.gdbtable net file %%a /close


You can modify the command above to loop through all of the files in the file geodatabase to close them all rather than specify them individually, which would be tedious since there are so many. 

From the command prompt, type NET FILES to see what the file path in the command above should contain. 

Remember that double percent characters are required for batch files (in other words, %% rather than %) but single percent characters are used when running the command outside of a batch script.
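If you'd rather drive this from Python than a batch file, the same NET FILES parsing can be done with a small helper. This is a sketch under the same assumption the batch one-liner makes about the NET FILES output layout (four header lines to skip, then the share ID in the first column and the path in the second); it only builds the close commands, and actually running them requires admin rights on the file server:

```python
def build_close_commands(net_files_output, folder_prefix):
    """Build a `net file <id> /close` command for every open file whose
    path starts with folder_prefix (e.g. the .gdb folder).

    Assumes the `net files` column layout the batch one-liner relies on:
    four header lines, then one row per open file with the ID first and
    the path second.  Paths containing spaces would need smarter parsing."""
    commands = []
    for line in net_files_output.splitlines()[4:]:
        parts = line.split()
        if len(parts) >= 2 and parts[1].lower().startswith(folder_prefix.lower()):
            commands.append("net file {} /close".format(parts[0]))
    return commands
```

You would feed it the output of `subprocess.check_output(["net", "files"], text=True)` on the server and run each returned command.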
MVP Esteemed Contributor

Thanks for posting about the FSMGMT.MSC command.  I've been struggling with a script for a day and a half, with a rename (or copy/delete) just giving me a generic error.  Tried the manual process, and even a command-line (DOS) delete, with no luck.  Thought I had it figured out this a.m. (rebooted my machines) and started moving forward on the script, then I ran into issues and reboots didn't work...ugh.  Thought I was the only one that would be accessing the fgdb, but with the command above I was able to find who had a lock on the timestamps file in an otherwise deleted/empty fgdb folder.  So thanks....old thread but still very helpful!

New Contributor III

Hi All, I just asked my systems admin what he does to forcibly close files when he needs to. He told me to google PsTools; the specific tool is called psfile.exe. I use C# ArcObjects for data processing automation, and I was able to use this before compacting a file geodatabase that we use in map services (it always has locks). Here's an example of what I run:

// Forcibly close open handles on WPUB.gdb so we can compact it.
// Run psfile.exe directly (launching it via cmd.exe would need a /c switch).
System.Diagnostics.Process P = new Process();
System.Diagnostics.ProcessStartInfo si = new ProcessStartInfo();
si.WindowStyle = ProcessWindowStyle.Hidden;
si.FileName = "T:\\GIS\\scripts\\PSTools\\psfile.exe";
si.Arguments = "\\\\gis10 F:\\gis10$\\wpub\\WPUB.gdb -c -nobanner";
si.RedirectStandardOutput = true;
si.UseShellExecute = false;
P.StartInfo = si;
P.Start();
P.WaitForExit();   // make sure the handles are closed before compacting

Then I run the compact immediately after. It works like a charm.
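For anyone doing this from Python instead of C#, the same psfile.exe invocation can be built with the subprocess module. A sketch, where the psfile path, server name, and share path are placeholders taken from the example above:

```python
import subprocess

def close_handles_with_psfile(psfile_exe, server, remote_path):
    """Build the psfile command line used above: -c closes the matching
    open handles, -nobanner suppresses the Sysinternals startup banner."""
    return [psfile_exe, "\\\\" + server, remote_path, "-c", "-nobanner"]

# Example (placeholder paths from the post above; requires admin rights):
# subprocess.run(
#     close_handles_with_psfile(r"T:\GIS\scripts\PSTools\psfile.exe",
#                               "gis10", r"F:\gis10$\wpub\WPUB.gdb"),
#     check=True)
```

Passing the arguments as a list avoids the quoting headaches of building one big command string, and `check=True` makes the script fail loudly if psfile returns an error instead of silently compacting a still-locked GDB.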
