I am attempting to publish a geoprocessing tool (referenced as "Delineate Core" below) from ArcGIS Desktop 10.4.1 to ArcGIS Server 10.3 (for Windows) using the workflow described in the documentation, and am running into some rather bizarre errors when analyzing the service that are keeping it from publishing. They are as follows:
SEVERITY | STATUS | CODE | DESCRIPTION | NAME | TYPE | DATA FRAME |
Error-High | Unresolved | 178 | Data: C:\Windows\bfsvc.exe, used by Script Delineate Core cannot be copied to the server | delineate | Tool | Unknown |
Warning-High | Unresolved | 24032 | Data source used by Script Delineate Core is not registered with the server and will be copied to the server: C:\dev\null | delineate | Tool | Unknown |
Warning-High | Unresolved | 24032 | Data source used by Script Delineate Core is not registered with the server and will be copied to the server: C:\ | delineate | Tool | Unknown |
I'm at a bit of a loss. bfsvc.exe is a system executable (note that it is the first encountered alphabetically in the Windows install directory). I don't have a "C:\dev\null" directory. And I'm not sure why it would be looking to copy my entire "C:\" drive. None of these things are referenced in my geoprocessing tool.
The analysis process also takes about 15 minutes to run, which makes me think that something in either my script or the way that I'm publishing the service is causing the Analyze process to look all over the place for...what? I don't know.
A rundown of my geoprocessing tool:
I attempted a publication workflow with the tool located on my "D:\" drive. I didn't get error code 178, but still got references to a non-existent "D:\dev\null" location and the entire "D:\" drive, similar to the table above. I gave publishing a go in that case (why not, since no errors were holding it back), but it hung up.
Where do I continue troubleshooting?
Many thanks in advance,
Christian
I was able to resolve the issue with some experimentation and further digging on GeoNet (though it doesn't appear many people have encountered this and posted up about it). A couple of threads led me to something that appears to be working:
The first thread highlights a similar issue with a GP service pre-publication analysis throwing a 00178 error. That thread referenced the second thread. Both identified an issue with how import statements were constructed in the Python script. The solutions identified were to make more specific calls to individual functions, classes, etc., within the imported modules.
In my script tool, I was importing arcpy like this:
import arcpy
Pretty standard, I would think. I replaced that line with specific imports to the modules in use:
from arcpy import env
from arcpy import GetParameterAsText, SetParameterAsText, GetParameterInfo
from arcpy import AddMessage, AddWarning, AddError
from arcpy import Raster, Describe, CreateUniqueName
from arcpy import Intersect_analysis, Statistics_analysis, Select_analysis, AddField_management, CalculateField_management
from arcpy import FeatureToRaster_conversion, RasterToPolygon_conversion, JSONToFeatures_conversion, FeaturesToJSON_conversion
from arcpy.sa import Watershed, Con, IsNull, SetNull
from arcpy.da import SearchCursor, UpdateCursor
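One side effect of this refactor worth noting (a general Python point, not anything specific to the analyzer): every call site in the script has to drop the arcpy. prefix, since the names are now bound directly at module level. The same pattern, illustrated with the standard library since arcpy isn't importable outside ArcGIS (file names below are made up):

```python
# os.path stands in for arcpy here; the refactor pattern is identical
from os.path import join, splitext

scratch = join("scratch", "flowacc.tif")  # was: os.path.join("scratch", "flowacc.tif")
root, ext = splitext(scratch)             # was: os.path.splitext(scratch)
```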
The GP pre-publication analysis process worked quickly and as expected: no bizarre attempts to copy system executables or source data from non-existent directories.
I'm not sure if
from arcpy import *
...would have yielded the same result, but as this took several days to resolve, I'm not inclined to try anything else.
Hopefully someone else finds this useful. Cheers!
Is the tool that you are publishing available at 10.3 also? Usually you want to publish from matching versions of Desktop and Server to avoid any functionality issues.
It is - I originally wrote it using 10.3 Desktop. That sounds as good a starting point as any, though. I'll revert to 10.3 and post the results here.
Cheers!
I downgraded to 10.3 Desktop -- clean uninstall, wiped the user data directories, etc. Basically a fresh, unencumbered install of ArcGIS Desktop (first time in a while, and frankly it needed it). Unfortunately, the issue appeared again, same as before. But at least I can rule out a version incompatibility issue, and any other potential funkiness that may have been introduced through my normal.mxt or my other customizations. Thanks!
Another thing to look into is how you construct paths within your script. I suggest using the os module and os.path.join to create your directory paths for the script, as that can help to avoid the errors you're seeing.
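For example (folder and file names below are made up for illustration):

```python
import os

# Build the path with os.path.join instead of concatenating strings,
# so the correct separator is inserted for the platform
scratch_dir = os.path.join("gisdata", "scratch")
out_fc = os.path.join(scratch_dir, "watershed_result.shp")
```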
For all scratch outputs I have a small shortcut function that relies on arcpy.CreateUniqueName and the scratchWorkspace environment variable to handle filenames and locations:
def so(temp_file_name, in_mem=False):
    """Scratch output full path generator."""
    if in_mem:
        return arcpy.CreateUniqueName(temp_file_name, "in_memory")
    else:
        return arcpy.CreateUniqueName(temp_file_name, arcpy.env.scratchWorkspace)

arcpy.env.scratchWorkspace = arcpy.env.scratchGDB
The scratch workspace is set to the scratch geodatabase. The issue also presented itself when I wrote to the in_memory workspace.
I will try swapping out the guts of this function for one that uses the system path construction functions instead, and see if that makes a difference.
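A sketch of what that swap might look like, using only os.path (the counter logic approximates what arcpy.CreateUniqueName does; the exact suffixing scheme is an assumption on my part):

```python
import os

def so_os(temp_file_name, scratch_dir):
    """Sketch: unique scratch path built with os.path instead of arcpy."""
    base, ext = os.path.splitext(temp_file_name)
    candidate = os.path.join(scratch_dir, temp_file_name)
    counter = 0
    # append an incrementing counter until the name is unused
    while os.path.exists(candidate):
        candidate = os.path.join(scratch_dir, "{0}{1}{2}".format(base, counter, ext))
        counter += 1
    return candidate
```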
Unfortunately, no luck with the approach of using os.path functions to construct paths to scratch data. Same errors from Analyze as before.
I'm going back through the documentation to see if I missed anything critical. One thing I noticed here was this bit about how the script tool is scanned to discover any project data used in the script:
"When your script is scanned, every quoted string (either single- or double-quotes) used in a Python variable or as an argument to a function is tested to see if it is a path to data that exists."
I have a lot of quoted strings in my script: references to column names and field values; docstrings; info, warning, and error messages; parameters for arcpy tools (e.g., arcpy.conversion.RasterToPolygon(input, output, "SIMPLIFY")); and...some empty strings (used as placeholders in data objects for values to be written by another application).
If this scanning process always happens, that might explain why Analyze takes 10-15 minutes to complete, and perhaps why the process treats "" as a location that data needs to be added from. Could this explain the errors?
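One way to see how an empty string could be swept up by such a scan (assuming, and this is only my assumption, that the scanner normalizes each candidate string to an absolute path before testing it):

```python
import os

# An empty string, normalized as a path, resolves to the current working
# directory -- so a scanner doing this could plausibly decide a drive
# root or project folder is "data" the script depends on
print(os.path.abspath(""))   # the process's current working directory
print(os.path.exists(""))    # False; "" is not itself an existing path
```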
What I don't understand is that I've followed the convention for parameterizing inputs and outputs described here. My reading is that the script shouldn't be scanned if the inputs/outputs are all set as parameters, which I've done.
---
Unrelated to the above - what I've also noticed is that all the examples in the documentation use os.path functions to manually construct paths to data, rather than using arcpy.CreateUniqueName() and/or the arcpy.env.scratchWorkspace variables to let ArcGIS sort it out. I will give that a go for my scratch data and see if that helps, as per my other comment.
Thanks in advance for any insight,
Christian
Christian Gass, thanks a lot. I faced the same problem and your solution worked for me.