arcpy.mp.ArcGISProject .aprx lock issue

03-20-2024 10:09 AM
__JackCharde__
Regular Contributor

Hey Everyone,

I wrote a Python script for a client that extracts a project template to a new folder, manipulates the data and layouts in that new project, creates PDFs, saves a copy of the extracted project file named with the current month/year, and then tries to delete the extracted project. The deletion always fails, and both my VS Code terminal and the GP tool I built around the script report that the project is open somewhere.

I have tried deleting every variable that references the project, its maps, layouts, and elements (every arcpy.mp object, for that matter), and the deletion still fails.
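
For context, the general shape of the script (simplified, with placeholder paths and file names) is roughly this:

import os
import shutil
import arcpy

# placeholder locations for illustration
template = r"C:\Templates\ClientTemplate.aprx"
out_folder = r"C:\Output\CurrentRun"
work_aprx = os.path.join(out_folder, "working.aprx")

os.makedirs(out_folder, exist_ok=True)
shutil.copy2(template, work_aprx)          # extract the template to a new folder
aprx = arcpy.mp.ArcGISProject(work_aprx)   # open the extracted project

# ...manipulate the data and layouts, then export the PDFs...
for layout in aprx.listLayouts():
    layout.exportToPDF(os.path.join(out_folder, layout.name + ".pdf"))

aprx.saveACopy(os.path.join(out_folder, "Project_2024_03.aprx"))  # month/year copy

del aprx                # drop every reference to the arcpy.mp objects
os.remove(work_aprx)    # always fails: "the project is open somewhere"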

Even if I close Pro (where I ran the GP tool) or VS Code (where I was debugging the script) and then go to File Explorer to delete the extracted Pro project, File Explorer says I can't because the file is open somewhere, even though nothing on my machine actually has it open.

Has anyone encountered this, and/or know a way around it?

Thanks!

- Jack C

18 Replies
MeghanBlair
Occasional Contributor

Did you ever find a solution? I have multiple projects locked read-only after automating with the arcpy.mp.ArcGISProject() function. I have tried del on the objects, but it's not working.


Robert_LeClair
Esri Esteemed Contributor

No solution as of yet. Question: are you using a file geodatabase or an enterprise geodatabase?

__JackCharde__
Regular Contributor

Hey Meghan,

No, I did not find a solution, and I ran into the same multiple-locked-projects issue. All of my GP tool's output goes to one directory, and if I executed multiple test runs back-to-back, none of the folders created by those runs could be deleted: every one of them supposedly had files open somewhere, even though Pro was closed and there was no active debugging session in the IDE where I developed the tool's script.

MeghanBlair
Occasional Contributor

Enterprise geodatabase

ElisseDeleissegues1
Occasional Contributor

I am also having this issue, running .py scripts locally via an automated trigger. There are currently about 12 similarly configured scripts, and 4 of them have gotten all the way to the step that overwrites an existing feature service and then failed.

In troubleshooting (nothing has changed recently; these have run automatically for over a year without issue), the only common factor I can find is that those .aprx projects are locked as [READONLY].

I am able to manually open each project and overwrite the feature layers, but that defeats the point of automation.

How do I remove the lock on these projects?

NseaGIS
Occasional Contributor

I'm having the same issue: I delete all aprx-related variables at the end of my script, the script completes, and the aprx is still locked when I run aprx.save(). Seriously, ESRI? You think restarting my Python shell is an acceptable solution? Ever heard of scheduled tasks? Locks seriously get in the way of those when the script runs automatically.

I am simply using .tif files in my .aprx project.

Josh-R
Regular Contributor

I've been encountering issues with feature classes in geodatabases and ArcGIS Pro projects remaining locked during processing for as long as I can remember. Deleting related variables has only ever worked for shapefiles, not for feature classes within a geodatabase. ESRI should definitely give users more control over these lock files with some kind of toggleable, use-at-your-own-risk setting.

That said, there is a workaround. With some clever coding you can terminate the active script (which releases the locks) and restart it where you left off, using the subprocess module and a command-line argument:

import sys
import subprocess

def task1():
    print('--begin task 1')

def task2():
    print('--begin task 2')

if len(sys.argv) > 1 and sys.argv[1] == "task1complete":
    task2()
    print('--script finished')
else:
    print('--begin script')
    task1()
    subprocess.Popen([sys.executable] + sys.argv + ["task1complete"], close_fds=True)  # relaunch the script
    sys.exit()  # ensure the original script closes and its locks release
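
The key detail is that the parent interpreter exits via sys.exit(), taking its file handles (and therefore its locks) with it, while the relaunched copy skips straight to task2() because of the extra "task1complete" argument. close_fds=True keeps the child from inheriting the parent's open file handles.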

DavidTillberg_community
Occasional Contributor

One way is to do all of your work in a subprocess call. When the subprocess exits, the process is destroyed and its locks go with it. Another thing to try is importing gc (Python's garbage-collection module) and calling gc.collect(): Python may not have cleaned up the objects even after you've deleted them, although this method doesn't guarantee that every object gets collected.
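
A minimal sketch of both ideas (the worker script and project path below are hypothetical):

import gc
import subprocess
import sys

import arcpy

# Option 1: isolate the arcpy.mp work in a child process; whatever locks
# it holds are released when that process exits. "worker.py" stands in
# for a script containing your ArcGISProject code.
subprocess.run([sys.executable, r"C:\scripts\worker.py"], check=True)

# Option 2: drop every reference, then force a garbage-collection pass.
# del only unbinds the name; gc.collect() also sweeps reference cycles
# the interpreter may not have cleaned up yet.
aprx = arcpy.mp.ArcGISProject(r"C:\Projects\Example.aprx")  # hypothetical path
# ... work with the project ...
del aprx
gc.collect()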

CMorneau
Occasional Contributor

Hello Robert. No apologies needed. I was not offended in any way. Again, I greatly appreciate the insight provided in your 03-20-2024 post.