Scratch GDB on server

04-13-2017 04:37 AM
AlessioDi_Lorenzo1
New Contributor II

Hi,

I noticed what looks like a bug when publishing geoprocessing script tools (based on Python scripts) as geoprocessing tasks on ArcGIS Enterprise 10.5.

I created the tool and ran it successfully in ArcMap 10.5, but when I try to publish the result, it fails with an error telling me that copying data to the server is not allowed (which is correct; I disabled it on purpose).

All the data I use is in folders registered in the server's data store, but I discovered that the problem is related to the arcpy.env.scratchGDB variable. If I use it, the server tries to upload intermediate data from the local path (which is not registered as a data store) instead of working with the scratch GDB created for the service.

To get the geoprocessing tool published, I need to replace every occurrence of arcpy.env.scratchGDB with the path of a registered folder. Then, to make the tool work properly on the server, I need to edit the Python source in C:\arcgisserver\directories\arcgissystem\arcgisinput\MyTool.GPServer\extracted\v101\mytool\mytool.py and put the arcpy.env.scratchGDB variable back in place.
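In code terms, the workaround amounts to something like the sketch below; the registered folder path and dataset name are hypothetical placeholders, not paths from the original post.

```python
import os

# Hypothetical folder that IS registered in the Server data store
REGISTERED_FOLDER = r"D:\gisdata\registered"

# Publish-time version: intermediates go under the registered folder
intermediate = os.path.join(REGISTERED_FOLDER, "scratch.gdb", "step1_output")

# After publishing, the extracted mytool.py on the server is edited
# back to use the service's own scratch geodatabase:
# intermediate = os.path.join(arcpy.env.scratchGDB, "step1_output")
```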

The same script worked perfectly on ArcGIS 10.3 (both Desktop and Server)

1 Solution

Accepted Solutions
KevinHibma-old
Regular Contributor III

This is not a bug; Server is working properly based on how you've configured it and set up your script/tool for publishing.

First, you're correct on the data copy point. If you've disabled data copying to the server, you can't publish something that needs to upload data as part of the service.

What's going wrong is how you've set up your scratch directory, combined with your assumptions about registering the folder and the service's behavior.

You have a few ways to configure/change your publishing to be more successful.

1) Set the scratch workspace inside ArcMap prior to running the tool and publishing. If it's writing to C:\Users\MyUserName\Documents\ArcGIS\scratch.gdb, then I can tell you haven't set it and you're letting it use the default location. A better practice is to make a scratch directory inside your current working directory and point ArcMap at that. Reference the root folder in your data store; now it won't copy the data when you publish. See the second screenshot in A quick tour of authoring and sharing geoprocessing services—Documentation | ArcGIS Enterprise: the tool has been built in a directory with a ToolData and a Scratch folder, so everything is nicely contained. Reference the folder the model lives in.
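The self-contained layout described in option 1 might look like the sketch below; the project root is a hypothetical example, not a path from the thread.

```python
import os

# Hypothetical project root; register this folder (or its parent)
# in the Server data store
root = r"D:\myTool"

tool_data = os.path.join(root, "ToolData")  # input data shipped with the tool
scratch = os.path.join(root, "Scratch")     # set as the scratch workspace in ArcMap

# os.makedirs(scratch, exist_ok=True)  # create locally before authoring
```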

2) Change your intermediate outputs to in_memory. Unless the tool absolutely needs to write to disk, writing to in_memory is generally faster (I say generally because there are some caveats). This is probably the easiest and best option.
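A rough sketch of option 2; the tool call and dataset names are made up for illustration, not taken from the original script.

```python
# On-disk intermediate (what the script does today):
# clipped = os.path.join(arcpy.env.scratchGDB, "clipped_roads")

# In-memory intermediate instead:
clipped = "in_memory/clipped_roads"
# arcpy.Clip_analysis(roads_fc, study_area_fc, clipped)  # assumed tool call
# arcpy.Delete_management(clipped)  # release the memory when finished
```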

3) Allow data copying to the server during publishing.

If you go with option 1 above, you need to understand that just because you've referenced the folder in the data store does not mean the service will write results to that directory. A geoprocessing service using %scratchGDB% or arcpy.env.scratchGDB writes to a specific jobs folder inside the arcgisserver directory structure.
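To make that concrete, the same arcpy.env.scratchGDB variable resolves to very different places in the two environments. Both paths below are illustrative sketches, not exact, and the job id is a placeholder.

```python
# Authoring in ArcMap with the default scratch location:
desktop_scratch_gdb = r"C:\Users\MyUserName\Documents\ArcGIS\scratch.gdb"

# Running as a geoprocessing service: a per-job scratch.gdb under
# the server's jobs directory (job id shown as a placeholder):
server_scratch_gdb = (r"C:\arcgisserver\directories\arcgisjobs"
                      r"\mytool_gpserver\<job_id>\scratch\scratch.gdb")
```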


4 Replies
AlessioDi_Lorenzo1
New Contributor II

Registering the scratch workspace's local path as a server data store seems to solve the problem (though it is a workaround).

I believe the problem is that the Service Editor reads the content of arcpy.env.scratchGDB (in my case, C:\Users\MyUserName\Documents\ArcGIS\scratch.gdb) instead of accepting it as is.

I never needed to register the scratch path as a data store with previous versions. Can you confirm this is a bug?

Thank you

AlessioDi_Lorenzo1
New Contributor II

This version of ArcGIS for Server looks like it's full of bugs!

This morning the geoprocessing task stopped working, apparently for no reason. No changes were made to the code, the script tool, or the service. It now fails with an "invalid parameters" error when creating a feature layer with arcpy.MakeFeatureLayer_management(my_simple_feature_class, "my_feature_layer").

The task worked with no problems until last evening.

 

Are ESRI tools meant to be used in a production environment? 10.5 is so unreliable!


AlessioDi_Lorenzo1
New Contributor II

Many thanks.

Say I go with option 1. Is it mandatory to set the scratch workspace using the ArcMap menu (Geoprocessing > Environments), or can I set it to my tool's scratch folder directly in my script?

It would be useful because I often work on different scripts and custom GP tools during the day, and I'm quite sure it would be easy to forget to change the scratch environment when jumping from one tool to another, at least for me.

Anyway, if, for example, I write in my Python code, just below the imports, something like:

arcpy.env.scratchWorkspace = r"D:\myTool\Scratch"

it probably won't be correct when the tool runs on the server as a GP task, because the server wants to use its own scratch folder and scratch GDB (inside the unique-ID job directory) to work properly.

So, is there a way to set the ArcMap environments from the Python script without affecting the server's behaviour?
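One possible pattern, sketched below under stated assumptions rather than as an official answer: redirect the scratch workspace in the script only when it still points at the per-user default, so whatever the server job has already set is left untouched. The helper function and all paths are hypothetical.

```python
def choose_scratch(current_scratch, user_default, local_scratch):
    # Redirect only when the scratch workspace is still under the
    # per-user default location (i.e. we are authoring in ArcMap);
    # otherwise keep whatever the environment (e.g. the server job)
    # has already set.
    if current_scratch and current_scratch.lower().startswith(user_default.lower()):
        return local_scratch
    return current_scratch

# Intended usage in the tool script (arcpy names assumed):
# arcpy.env.scratchWorkspace = choose_scratch(
#     arcpy.env.scratchWorkspace,
#     r"C:\Users\MyUserName\Documents\ArcGIS",
#     r"D:\myTool\Scratch")
```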
