
Publishing a Geoprocessing Service that uses Feature Service URLs

04-07-2022 08:28 AM
ZacharyHart
Honored Contributor

I've developed a script tool which I'm looking to publish as a GP service. The script uses several feature service URLs from our Enterprise system as hard-coded inputs; using direct DB connections to the underlying database isn't a viable option.

Obviously, you cannot register server URLs with your Data Store, so the validation process cannot find these resources in the Data Store; it warns that the data source is not registered and will therefore be copied.


inFeatures = "https://myserver.mydomain.com/server/rest/services/Folder/FeatureServiceName/FeatureServer/1"


I thought that maybe I could simply parameterize these URLs in the Script Tool, but I'm not sure you can make those parameters hidden or locked.

Is there any way to go about publishing this without using a direct DB connection?

EDIT: I'm suddenly remembering something about registering a folder that contains a valid AGS connection file...hmmm

UPDATE: No dice. I saved a server connection file (.ags) to a folder registered with the data store, but I am not able to use this input in any code. For example, making a feature layer from the REST endpoint of a layer within the feature service via the .ags file:


testLayer = arcpy.management.MakeFeatureLayer("\\\\sharedLocation\\Folder\\AGSConnections\\connectionFile.ags\\serverFolder\\featureService.FeatureServer\\layer","test")


[Note this is the format used if you drag the layer from ags connection file in the catalog pane to the Python window in Pro.]

12 Replies
DougBrowning
MVP Esteemed Contributor

Did you ever figure this out @ZacharyHart ?

samc-gis
Emerging Contributor

I experienced this issue myself when trying to set up a local script as a GP service. I foolishly assumed that validation would "just know" that the hard-coded feature services were feature services, not local data, and therefore would not attempt to copy anything when publishing. When I published the script with the hard-coded URLs, I ended up having to kill the publishing job: as we all know, it attempted to copy the data to the server despite no actual local data being referenced in the script.

The issue with validation flagging feature service URLs as file directories can be avoided by storing all URLs in a YAML configuration file. Instead of embedding/hard-coding the URLs directly in the script or tool, reference them dynamically by reading from the YAML file at runtime. This way, the only file that gets copied with the GP service is the small YAML file, avoiding any validation errors related to directory structures and drastically reducing the time to publish the GP service.

For example, you can structure the YAML file like this:

portal:
  master_fs: "https://yourserver.com/FeatureServer/0"
feature_services:
  other_fs1: "https://yourserver.com/FeatureServer/1"
  other_fs2: "https://yourserver.com/FeatureServer/2"
  other_fs3: "https://yourserver.com/FeatureServer/3"

Then, in your script, load the URLs dynamically:

import yaml

def read_config(config_path):
    # Load the YAML file of feature service URLs at runtime.
    with open(config_path, 'r') as file:
        return yaml.safe_load(file)

config = read_config("config.yaml")
master_feature_service = config["portal"]["master_fs"]
feature_service_urls = config["feature_services"]  # dict of name -> URL

This approach ensures that no hard-coded paths cause validation issues and makes it easier to update URLs without modifying the script itself. If you need to add or remove variables, you can modify the YAML file published with the GP service by going to the GP service's file directory on your server.
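One detail worth hedging against: when the script runs as a GP service, the current working directory is often not the folder the config file was copied to, so a bare "config.yaml" may not resolve. A minimal sketch (the helper name is my own) that resolves the config path relative to the script file itself:

```python
import os

def config_path_for(script_file, config_name="config.yaml"):
    # Publishing copies the config alongside the script, so resolve the
    # path relative to the script file rather than the working directory.
    return os.path.join(os.path.dirname(os.path.abspath(script_file)),
                        config_name)

# Typical usage inside the script tool:
# config = read_config(config_path_for(__file__))
```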

Hope this helps anyone that has been struggling with this issue!

DougBrowning
MVP Esteemed Contributor

Yes, a regular old text file works also. In the end, I now just comment out my script entirely, publish, then uncomment it back on the server. It works much faster, and nothing gets re-pathed at all. It took some convincing with IT, but now I can skip all validation. The script is much cleaner now, and I can publish in about a minute versus the 20 minutes the re-pathing was taking. Going forward, I just make changes to the script on the server instead of republishing. I know it's not great to change production like that, but I only really do it for dev scripts. It's light years faster for troubleshooting the 500 other things that break in GP services.
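For the plain-text-file variant, a minimal sketch of a loader (my own function name and a hypothetical `name=url` line format, not anything the thread prescribes) that avoids the PyYAML dependency entirely:

```python
def load_urls(path):
    # Parse simple "name=url" lines, skipping blanks and "#" comments.
    urls = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            name, _, url = line.partition("=")
            urls[name.strip()] = url.strip()
    return urls

# Usage: urls = load_urls("service_urls.txt"); inFeatures = urls["fs1"]
```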