Not getting anything from the Geoprocessing board, so trying here. Thanks.
On Server 11.3, the scratch workspace is not coming back as the job dir under the arcgisjobs dir. Instead it is returning, and writing to, a path under AppData for the service account. This dir is not part of the cleanup schedule and is growing over time.
arcpy.AddMessage("Scratch Workspace " + arcpy.env.scratchWorkspace)
Scratch Workspace C:\Users\serveruser~1\AppData\Local\Temp\aim\speedtest_gpserver\j9b2d66b7a00d4a8696d2fe7973da3ba4\scratch
The post here shows it should be my arcgisjobs dir, right? https://community.esri.com/t5/python-questions/geoprocessing-service-to-return-a-file/td-p/522062 I think I remember seeing it work that way in the past.
I can even see it create the job dir in the arcgisjobs folder, with a scratch folder and even a scratch GDB, but it does not use them. And I can find no way in the script to get this job dir.
Why is it not using arcgisjobs as configured in both the GP tool and the server settings? Is this a bug of some sort? I cannot figure out how to get the job dir in the script if none of these work. Scratch GDB is doing the same thing, and Scratch Folder is the same deal: it gives me the AppData dir and not the jobs dir.
thanks
Have you verified that your project has the jobs dir configured correctly? I believe the AppData/Temp location is used as a fallback when the project does not have a specified scratch location.
You could try using the EnvManager context manager to force it to use a specific directory.
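For reference, arcpy.EnvManager applies the given environment overrides and restores the previous values when the block exits. Since arcpy is only importable inside an ArcGIS install, here is a dependency-free sketch of that pattern (`_Env` and `env_manager` are stand-ins for illustration, not arcpy APIs):

```python
from contextlib import contextmanager

class _Env:
    """Stand-in for arcpy.env (arcpy itself is not imported here)."""
    scratchWorkspace = None

env = _Env()

@contextmanager
def env_manager(**overrides):
    """Mimics arcpy.EnvManager: apply overrides, restore old values on exit."""
    old = {name: getattr(env, name) for name in overrides}
    for name, value in overrides.items():
        setattr(env, name, value)
    try:
        yield env
    finally:
        for name, value in old.items():
            setattr(env, name, value)

# With real arcpy this would be:
#   with arcpy.EnvManager(scratchWorkspace=r"D:\gp_scratch"):
#       ...  # tools run here use the forced scratch location
```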
Yes, the jobs dir is set properly on each GP tool and in the server settings.
I could force a new root, but I still would not know the job number. I am still stuck with no way to get the actual path to each job's dir at runtime. I really think this was working before 11.3. It is still creating the scratch dir and scratch.gdb in the jobs dir, but it never uses them.
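For what it's worth, the job id does appear in the temp path that arcpy.env.scratchWorkspace returns, so one possible (unsupported) workaround is to parse it out and rebuild the arcgisjobs path yourself. A sketch, assuming the temp layout always ends with folder\service_gpserver\jobid\scratch as in the paths posted in this thread (`job_dir_from_scratch` is a hypothetical helper, not an arcpy API):

```python
def job_dir_from_scratch(scratch_ws, jobs_root):
    """Map the local-temp scratch workspace back to the arcgisjobs
    directory for the same job.

    Assumes the temp scratch path ends with
    <folder>\\<service>_gpserver\\<jobid>\\scratch, matching the paths
    posted in this thread; jobs_root is whatever jobs directory is
    configured for the site.
    """
    # Normalize separators so splitting works on either style of path.
    parts = scratch_ws.replace("\\", "/").rstrip("/").split("/")
    # The last four components: folder, service, job id, "scratch".
    folder, service, job_id, _scratch = parts[-4:]
    # Build a Windows-style path, since these are Windows/UNC locations.
    return jobs_root.rstrip("\\/") + "\\" + "\\".join((folder, service, job_id))
```

Whether the server still copies anything from that arcgisjobs location is a separate question, but this at least recovers the per-job path at runtime.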
When using UNC paths for ArcGIS Server working directories, the arcgisjobs folder defined for the service is where job metadata, results, outputs, etc. go; it is not the folder where a geoprocessing task/job does its work. When a task/job is running on a specific ArcGIS Server machine, a jobs folder is created in the local temp folder and used for processing data. When processing is complete, any results and output are copied to the arcgisjobs folder defined for the service.
For a single-machine ArcGIS Server deployment where the working directories point to local drives, the arcgisjobs folder for the service may be where the geoprocessing task/job does its actual work. (We don't deploy single-machine sites, so I can't say for sure.)
"When processing is complete, any results and output are copied to the arcgisjobs folder defined for the service."
Wouldn't this be the output folder though?
The problem with it using the local temp folder is that it is not part of the cleanup routines. It should use the scratch folder under the jobs folder, which gets cleaned up regularly. At least that would make a lot more sense.
So do I have to have my scripts clean up after themselves now? (I worry that deleting at the end of the script will cause locking issues.) Basically, the server is now guaranteed to keep running out of space. I really thought it did not used to do this; I think I remember watching it create things in the jobs folder. Plus we have never had this issue before.
I will agree that since 11.0 the local temp folder of the account running ArcGIS Server gets bloated much faster. It isn't just jobs; we see it getting bloated with all kinds of files, and Esri does not seem to be cleaning it up. We implemented a scheduled task on each ArcGIS Server machine that runs daily and deletes any file or folder older than X days in the local temp folder of the service account running ArcGIS Server. Should we have to do it? No, but it is trivial to implement and it addresses the issue, so I focus my time on bigger challenges with ArcGIS Server.
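A minimal sketch of that scheduled-task cleanup, assuming it runs as the ArcGIS Server service account against that account's local temp folder (`purge_old` and the age threshold are illustrative, not anything Esri ships):

```python
import os
import shutil
import time

def purge_old(temp_dir, max_age_days):
    """Delete top-level files and folders in temp_dir whose last
    modification is older than max_age_days."""
    cutoff = time.time() - max_age_days * 86400
    for name in os.listdir(temp_dir):
        path = os.path.join(temp_dir, name)
        try:
            if os.path.getmtime(path) < cutoff:
                if os.path.isdir(path):
                    shutil.rmtree(path, ignore_errors=True)
                else:
                    os.remove(path)
        except OSError:
            # A running job may still hold a lock; skip it and let the
            # next scheduled run retry.
            pass
```

Scheduling it daily (e.g. via Windows Task Scheduler) with a generous age threshold avoids deleting anything a long-running job is still using.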
Regarding jobs working directories, having the process directly interact with the UNC path for its processing would have noticeable, possibly dramatic, performance impacts; and the risk of data corruption would increase. Back in the ArcGIS Server 9.x days, Esri did try that approach, and they even had a "localJobsDirectory" setting where the service working directory could be pointed locally. Eventually, they made the local job directory the standard to improve performance and lower data corruption risks.
I just checked the logs of my geoprocessing tasks running on our 11.3 server, and I'm not seeing the same thing. My code uses the arcpy.env.scratchGDB setting, and it appears to be putting things in the right place. Here is an example:
2024-12-17 12:41:07,960 - Temporary feature class C:\arcgisserver\directories\arcgisjobs\row_as_habitat\analyzebeeoccurrence_gpserver\jd6c0f33269f649b8a0551b4440c939e5\scratch\scratch.gdb\row_buffered_centerlines_319048053
I don't have access to the server, so I can't say for sure whether the files are getting cleaned up, but I'm assuming so.
That is where I am seeing the server create the scratch GDB too. But arcpy.env.scratchGDB returns a path under AppData, so it never uses it.
This got cut off above, I think. Here is what I get from a test GP tool. Server is 11.3.
Scratch Workspace C:\Users\userreaplace~1\AppData\Local\Temp\aim\speedtest_gpserver\jef84e091d73b42078280de5458f66360\scratch
Scratch Folder C:\Users\userreaplace~1\AppData\Local\Temp\aim\speedtest_gpserver\jef84e091d73b42078280de5458f66360\scratch
Scratch GDB C:\Users\userreaplace~1\AppData\Local\Temp\aim\speedtest_gpserver\jef84e091d73b42078280de5458f66360\scratch\scratch.gdb
Package Workspace \\loc\ServerDev\directories\arcgissystem\arcgisinput\folder\SpeedTest.GPServer\extracted\p30
Script Workspace \\loc\ServerDev\directories\arcgissystem\arcgisinput\folder\SpeedTest.GPServer\extracted\p30
Don, your ArcGIS Server site must be a single-machine site because multi-machine sites have to use network shares for the various ArcGIS Server directories. The behavior of geoprocessing services varies between local and network share folders.
Yes - that is correct.