
Locks will not clear when deleting Python object

03-18-2026 03:08 PM
ChrisCowin_dhs
Regular Contributor

I have a script with three nested for loops. The top two just do a little housekeeping, but the innermost does the following:

  1. Builds paths to CSVs
  2. Reads those CSVs into pandas
  3. Deletes the working GDB if it exists 
  4. Recreates that GDB
  5. Makes a Closest Facility Analysis Layer
  6. Converts the incidents and facilities dataframe into XY Tables
  7. Loads those XY Tables into the incidents and facilities sublayers of the Closest Facility Analysis Layer
  8. Solves for routes
  9. Exports those routes as a CSV

The first run of the loop is fine, but I get into trouble when deleting the working GDB. I've tried probably two dozen different ways to get around the lock, but nothing seems to help:

  1. I've created variables for the closest facility layers, the incident/facility XY tables, the results of the solve, and the GDB itself, and tried to use Python's del to delete the objects from the interpreter. None of that seems to do anything; all the lock files are still there.
  2. I've tried arcpy.ClearWorkspaceCache_management(); that doesn't do anything, and all the lock files remain.
  3. I've tried just adding waits in a couple of places. Didn't work.
  4. I've tried a function that uses while time.time() to wait for the lock to be released on its own, thinking I'm just rushing the lock release. After 5 minutes the locks are still there.
  5. I've tried shutil.rmtree instead of arcpy.Delete_management(). That does nothing.
  6. I've tried splitting the entire thing into a series of functions, hoping the different namespaces would fix the issue. That does nothing.
  7. I've tried Python's built-in garbage collector. That does nothing.
  8. I've replicated the script from scratch, trying to figure out when and where the locks occur and which one is causing the issue. That script works just fine; no errors, even though I'm functionally doing the same thing.
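For reference, the wait-for-release approach in attempt 4 amounts to this pattern (a generic sketch; `delete_fn` stands in for whatever deletion call is being retried, and all the names here are mine, not from the script):

```python
import time

def delete_with_retry(delete_fn, timeout=300.0, interval=5.0):
    """Keep calling delete_fn until it succeeds or timeout (seconds) elapses.

    Returns True on success, False if the timeout was reached. An OSError
    from delete_fn is treated as "still locked, try again later".
    """
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            delete_fn()
            return True
        except OSError:
            time.sleep(interval)
    return False
```

As described above, this only helps when the lock is eventually going to be released; if the owning process still holds it, the loop just runs out the clock.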

I'm absolutely at my wit's end trying to manage these locks. Nothing works, and the ESRI resources either don't do what ESRI says they're supposed to do, or they explain them so poorly that it doesn't get through my thick skull.

So, the locks that exist at the end of the first loop are:

  1. _gdb.sr.lock
  2. Closest Facility Analysis Sublayers
    1. Barriers.sr.lock
    2. CFRoutes.sr.lock
    3. Facilities.sr.lock
    4. Facilities.rd.lock
    5. Incidents.sr.lock
    6. Incidents.rd.lock
    7. PolygonBarriers.sr.lock
    8. PolylineBarriers.sr.lock

How am I supposed to remove these?

# =============================================================================
# Libraries
# =============================================================================
import pandas as pd
import numpy as np
import arcpy
import time
import shutil
import os
import gc

arcpy.env.overwriteOutput = True
pd.options.display.max_columns = None

# =============================================================================
# CONFIG
# =============================================================================

mce_list = [
    # 'ADV HEALTH',
    # 'ALLCARE',
    # 'CHA',
    # 'CPCCO',
    # 'EOCCO',
    # 'HEALTHSHARE',
    # 'IHN',
    # 'JCC',
    'PSCS-CG',
    'PSCS-CO',
    'PSCS-LN',
    'PSCS-MP',
    # 'TRILLIUM-SW',
    # 'TRILLIUM-TC',
    # 'UHA',
    # 'YCCO'
    ]

Plan_to_services = {
    'CCOA' : ['Physical', 'Behavioral', 'Dental'],
    'CCOB' : ['Physical', 'Behavioral'],
    'CCOE' : ['Behavioral'],
    'CCOF' : ['Dental'],
    'CCOG' : ['Behavioral', 'Dental'],
}

Specialty_to_service = {
    # 'Allergy and Immunology' : 'Physical',
    # 'Cardiology': 'Physical',
    # 'Dermatology': 'Physical',
    # 'Durable Medical Equipment': 'Physical',
    # 'Endocrinology': 'Physical',
    # 'ENT or Otolaryngology': 'Physical',
    # 'Gastroenterology': 'Physical',
    # 'Gynecology': 'Physical',
    # 'Hematology': 'Physical',
    'Hospital': 'Physical',
    'Mental Health Provider': 'Behavioral',   #Behavioral
    'Mental Health Residential': 'Behavioral', #Behavioral
    'Methadone Clinic': 'Physical',
    'Nephrology': 'Physical',
    'Neurology': 'Physical',
    'Obstetrics': 'Physical',
    'Occupational Therapy': 'Physical',
    'Oncology Medical': 'Physical',
    'Oncology Radiation': 'Physical',
    'Ophthalmology': 'Physical',
    'Optometry': 'Physical',
    'Pharmacy': 'Physical',
    'Physical Therapy': 'Physical', 
    'Podiatry': 'Physical',
    'Primary Care': 'Physical', 
    'Primary Care Dentistry': 'Dental', #Dental
    'Psychiatry': 'Physical',
    'Psychology': 'Behavioral', #Behavioral
    'Pulmonology': 'Physical',
    'Reproductive Endocrinology': 'Physical',
    'Rheumatology': 'Physical',
    'Skilled Nursing Facility': 'Physical',
    'Speech Language Pathology': 'Physical', 
    'Substance Use Disorder Provider': 'Physical',
    'Substance Use Disorder Residential': 'Physical', 
    'Urology': 'Physical'
}

def extract_cco_plan(filename):
    for plan in Plan_to_services:
        if plan in filename:
            return plan
    raise ValueError(f'CCO Plan not found in {filename}')

def get_specialties_for_plan(plan):
    return [
        spec for spec, svc in Specialty_to_service.items()
        if svc in Plan_to_services[plan]
    ]
nds = r'{path-to-streetmappremium}'

coordinate_system = (
    'GEOGCS["GCS_WGS_1984",DATUM["D_WGS_1984",'
    'SPHEROID["WGS_1984",6378137.0,298.257223563]],'
    'PRIMEM["Greenwich",0.0],UNIT["Degree",0.0174532925199433]];'
    '-400 -400 1000000000;-100000 10000;-100000 10000;'
    '8.98315284119521E-09;0.001;0.001;IsHighPrecision'
)

travel_modes = arcpy.na.GetTravelModes(nds)
travel_mode_miles = arcpy.na.TravelMode(travel_modes['Driving Time'])
travel_mode_miles.distanceAttributeName = 'Miles'

field_mappings_locations = (
    "Name Address #;CurbApproach # 0;Attr_Minutes # 0;"
    "Attr_TimeAt1KPH # 0;Attr_TravelTime # 0;"
    "Attr_TruckMinutes # 0;Attr_TruckTravelTime # 0;"
    "Attr_WalkTime # 0;Attr_Kilometers # 0;"
    "Attr_Miles # 0"
)

# =============================================================================
# PATHS
# =============================================================================

path = os.getcwd()
print('Current Working Directory:', path)

staging = os.path.join(path, 'Staging Data')
os.makedirs(staging, exist_ok=True)

na_gdb_path = os.path.join(path, 'Network Adequacy.gdb')

closest_facility_path = os.path.join(na_gdb_path, 'Closest Facility')
routes = os.path.join(closest_facility_path, 'Routes')
incidents = os.path.join(closest_facility_path, 'Incidents')
facilities = os.path.join(closest_facility_path, 'Facilities')

routes_gdb = os.path.join(na_gdb_path, 'Routes_Export')
incidents_gdb = os.path.join(na_gdb_path, 'Incidents_Export')
facilities_gdb = os.path.join(na_gdb_path, 'Facilities_Export')

routes_csv = os.path.join(staging, 'Routes_ExportTable.csv')
incidents_csv = os.path.join(staging, 'Incidents_ExportTable.csv')
facilities_csv = os.path.join(staging, 'Facilities_ExportTable.csv')

# =============================================================================
# HELPER FUNCTIONS
# =============================================================================

def safe_delete_csv(path):
    if os.path.exists(path):
        os.remove(path)

def export_gdb_to_csv(in_table, gdb_table, csv_path):
    """Lock-safe export: FeatureClass/Table → GDB → temp CSV → rename"""
    if arcpy.Exists(gdb_table):
        arcpy.Delete_management(gdb_table)

    arcpy.conversion.ExportTable(in_table, gdb_table)

    temp_csv = csv_path.replace('.csv', '_tmp.csv')
    safe_delete_csv(temp_csv)

    tbl = arcpy.conversion.TableToTable(
        gdb_table,
        os.path.dirname(csv_path),
        os.path.basename(temp_csv)
    )

    safe_delete_csv(csv_path)
    os.rename(temp_csv, csv_path)
    del tbl

# =============================================================================
# MAIN LOOP
# =============================================================================

subccos = ['A', 'B', 'E', 'F', 'G']

for subcco in subccos:
    for mce in mce_list:
        members_csv = os.path.join(path, f'Q3 2025/01 Members Output Adult CCO Plan/{mce}_CCO{subcco}_AdultMembers.csv')
        cco_plan = extract_cco_plan(members_csv)
        spec_list = get_specialties_for_plan(cco_plan) 
        
        print(f'\nMCE: {mce}')
        print(f'CCO Plan: {cco_plan}')
        print(f'Specialties used: {spec_list}')
        
        for spec in spec_list:
            print(f'\nProcessing {mce} - {spec}')

            routes_csv = os.path.join(
                staging,
                f"Routes_ExportTable_{spec.replace('-', '_')}_{mce.replace('-', '_')}.csv")
            
            facilities_csv_in = os.path.join(
                path, f'Q3 2025/02 Facilities Output Adult/{mce}_{spec}_Facilities.csv'
            )
            
            output_csv = os.path.join(
                path, f'Q3 2025/03 Routing Output Adult/{mce}_{spec}_CCO{subcco}_Routes.csv'
            )

            members_df = pd.read_csv(members_csv, dtype=str)
            facilities_df = pd.read_csv(facilities_csv_in, dtype=str)

            if 'Age_Group' not in facilities_df.columns:
                print("Warning: Age_Group column not found in facility data. Skipping Age_Group.")

            print(f'routing {len(members_df)} incidents to {len(facilities_df)} facilities...')
            if len(facilities_df) == 0:
                print('No facilities found – writing empty output.')
                pd.DataFrame({
                    'MCE_Name': mce,
                    'Specialty': spec,
                    'Incident_Match_ID': members_df['Incident_Match_ID'],
                    'Facility_Match_ID': np.nan,
                    'Driving_Minutes': np.nan,
                    'Driving_Miles': np.nan
                }).to_csv(output_csv, index=False)
                
            if arcpy.Exists(na_gdb_path):
                try:
                    del na_gdb
                except Exception as e:
                    print(e)

                arcpy.Delete_management(na_gdb_path)

            na_gdb = arcpy.management.CreateFileGDB(path, 'Network Adequacy.gdb')
            arcpy.env.workspace = na_gdb_path
            
            print('Making a new Closest Facility layer...')
            closest_facility = arcpy.na.MakeClosestFacilityAnalysisLayer(
                nds, closest_facility_path, travel_mode_miles
            )

            incidents_xy_path = os.path.join(na_gdb_path, 'Incidents_XY')
            facilities_xy_path = os.path.join(na_gdb_path, 'Facilities_XY')

            print('Building Incidents XY Table')
            incidents_xy = arcpy.management.XYTableToPoint(
                members_csv, incidents_xy_path, 'X', 'Y', None, coordinate_system
            )
            
            print('Building Facilities XY Table')
            facilities_xy = arcpy.management.XYTableToPoint(
                facilities_csv_in, facilities_xy_path, 'X', 'Y', None, coordinate_system
            )
            
            print('Adding Incidents XY Table to Analysis Layer')
            arcpy.na.AddLocations(closest_facility_path, 'Incidents', incidents_xy_path)
            
            print('Adding Facilities XY Table to Analysis Layer')
            arcpy.na.AddLocations(closest_facility_path, 'Facilities', facilities_xy_path)

            print('Solving')
            solve = arcpy.na.Solve(closest_facility_path, 'SKIP', 'TERMINATE')

            # EXPORTS (SAFE)
            print('Exporting Routes / Incidents / Facilities')
            if arcpy.Exists(routes_csv):
                arcpy.Delete_management(routes_csv)

            export_gdb_to_csv(routes, routes_gdb, routes_csv)
            export_gdb_to_csv(incidents, incidents_gdb, incidents_csv)
            export_gdb_to_csv(facilities, facilities_gdb, facilities_csv)

            try:
                del closest_facility
                del incidents_xy
                del facilities_xy
                del solve
                del na_gdb
            except Exception as e:
                print(e)

 

5 Replies
RhettZufelt
MVP Notable Contributor (Accepted Solution)

Not sure if this is your issue, but setting arcpy.env.workspace puts a lock on the FGDB.

I found that setting arcpy.env.workspace = None does not release it, but following it with arcpy.ClearWorkspaceCache_management() does.

Also, there doesn't appear to be any reason to delete the FGDB right before re-creating it. Have you tried deleting it at the very end of the script, after you have deleted all the variables?

Also, in my testing, even if there is a _gdb.sr.lock (I did not test with other locks), this code will delete it anyway:

def ToT(ds):
    if arcpy.Exists(ds):
        print(f"{ds} exists, deleting...", end="")
        try:
            arcpy.Delete_management(ds)
            print("successful")
        except Exception as e:
            print(f"unable to delete, reason: {e}")
    else:
        print(f"{ds} doesn't exist, skipping...")

ToT(na_gdb_path)

In case this helps,

R_

ChrisCowin_dhs
Regular Contributor

@RhettZufelt wrote:

Not sure if this is your issue, but setting arcpy.env.workspace puts a lock on the FGDB.

I found that setting arcpy.env.workspace = None does not release it, but following it with arcpy.ClearWorkspaceCache_management() does.


This doesn't seem to clear the offending lock in my case.


@RhettZufelt wrote:

Also, there doesn't appear to be any reason to delete the FGDB right before re-creating it. Have you tried deleting it at the very end of the script, after you have deleted all the variables?

Also, in my testing, even if there is a _gdb.sr.lock (I did not test with other locks), this code will delete it anyway:

So, this works, but only because it just doesn't delete the GDB in the try portion of the code, which still seems fundamentally broken to me. I should be able to delete a GDB when I want to, and, seemingly, it is impossible to delete a GDB you have used at any point in the script.

AlfredBaldenweck
MVP Frequent Contributor

What happens if you delete everything in the GDB first?

RhettZufelt
MVP Notable Contributor

I guess that is what I was referring to with 'end of the script'; it would be the last entry in the try block.

            try:
                del closest_facility
                del incidents_xy
                del facilities_xy
                del solve
                del na_gdb
                ToT(na_gdb_path)
            except Exception as e:
                print(e)

But, that doesn't work either?

R_

HaydenWelch
MVP Regular Contributor

If you want to modify anything in the environment, it's good practice to use EnvManager:

from arcpy import EnvManager

with EnvManager(workspace='<path/to/workspace>'):
    ...  # do workspace ops here

# arcpy.env.workspace is reset to default/cleared

 

I've found that doing basically anything with global state in a complex script creates a huge amount of footguns. It's best to keep your context local to your operations.

 

Clearing the cache is also good practice. You can subclass the existing EnvManager and re-implement its __exit__ method to clear the active cache before resetting:

from arcpy import EnvManager
from arcpy.management import ClearWorkspaceCache
import arcpy

class ClearEnvManager(EnvManager):
    def __exit__(self, exc_type, exc_value, traceback):
        ClearWorkspaceCache(arcpy.env.workspace)
        return super().__exit__(exc_type, exc_value, traceback)
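The subclass-and-extend __exit__ idea above is plain Python, so it can be demonstrated without arcpy at all. Here is a self-contained sketch with made-up names (Workspace stands in for EnvManager, and the print stands in for ClearWorkspaceCache):

```python
class Workspace:
    """Stand-in for EnvManager: saves state on entry, restores it on exit."""
    current = None  # stand-in for arcpy.env.workspace

    def __init__(self, path):
        self.path = path

    def __enter__(self):
        self.previous = Workspace.current
        Workspace.current = self.path
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        Workspace.current = self.previous
        return False  # don't swallow exceptions

class CleaningWorkspace(Workspace):
    """Extends __exit__ to run extra cleanup before restoring prior state."""
    def __exit__(self, exc_type, exc_value, traceback):
        # Stand-in for ClearWorkspaceCache(arcpy.env.workspace); runs while
        # the managed workspace is still active.
        print(f'clearing cache for {Workspace.current}')
        return super().__exit__(exc_type, exc_value, traceback)

with CleaningWorkspace('scratch.gdb'):
    assert Workspace.current == 'scratch.gdb'
assert Workspace.current is None  # prior state restored after the block
```

The same shape applies to the real ClearEnvManager above: the override does its cleanup first, then delegates to the parent so the normal environment restore still happens.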