I've developed a script tool which I'm looking to publish as a GP service. The script uses several feature service URLs from our Enterprise system as hard-coded inputs; using direct DB connections to the underlying database isn't a viable option.
Obviously, you cannot (and do not) register server URLs with your Data Store, but since the validation process cannot find these resources in the Data Store, it warns that the data source is not registered and will therefore be copied to the server.
inFeatures = "https://myserver.mydomain.com/server/rest/services/Folder/FeatureServiceName/FeatureServer/1"
I thought that maybe I could simply parameterize these URLs in the Script Tool, but I'm not sure you can make such parameters hidden or locked.
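For what it's worth, a script tool's validation code can at least pre-fill and disable a parameter so users can't edit it in the tool dialog (whether that survives publishing is a separate question). This is a hypothetical sketch following Pro's ToolValidator template; it assumes parameter index 0 is a String parameter holding the URL, and the URL is the example endpoint from the question:

```python
# Hypothetical ToolValidator sketch (Tool Properties > Validation in Pro).
# Assumes parameter 0 is a String parameter meant to hold the service URL.
class ToolValidator(object):
    def __init__(self):
        import arcpy  # deferred import, matching Pro's validator template
        self.params = arcpy.GetParameterInfo()

    def initializeParameters(self):
        # Example endpoint from the question, set as a fixed default.
        self.params[0].value = (
            "https://myserver.mydomain.com/server/rest/services/"
            "Folder/FeatureServiceName/FeatureServer/1"
        )
        self.params[0].enabled = False  # greys the control out in the dialog
        return
```

Note a disabled parameter still appears in the dialog; it just can't be edited there.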
Is there any way to go about publishing this without using a direct DB connection?
EDIT: I'm suddenly remembering something about registering a folder that contains a valid AGS connection file...hmmm
UPDATE: No dice. I saved a server connection file (.ags) to a folder registered with the data store, but I am not able to use this input in any code. For example, making a feature layer from the REST endpoint of a layer within the feature service via the .ags file:
testLayer = arcpy.management.MakeFeatureLayer("\\\\sharedLocation\\Folder\\AGSConnections\\connectionFile.ags\\serverFolder\\featureService.FeatureServer\\layer","test")
[Note: this is the format used if you drag the layer from an .ags connection file in the Catalog pane to the Python window in Pro.]
Can you please post a code snippet showing how you are defining the URL and using it to access the data store?
I'm simply referencing the Feature Service REST endpoints like so:
inFeatures = "https://myserver.mydomain.com/server/rest/services/Folder/FeatureServiceName/FeatureServer/1"
If it is the consolidation process that is changing the string to something incorrect (I've seen this happen with directory paths, but not URLs), then I've been able to get around problems like that by breaking up the string literal. I'm not sure if this is relevant to your situation.
Here is an example of how I did it:
import os

# To get around a problem with consolidation we break the template directory name apart
# to trick the consolidator so it doesn't try to change the path (for shared projects the
# consolidator puts the files in the right place but generates the wrong path literals)
template_rptx = os.path.dirname(__file__) + os.path.sep + 'report' + '_' + 'templates' + os.path.sep + 'CCAA_Report.rptx'
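Applying the same workaround to a URL might look like the sketch below (the endpoint is the example one from the question; whether this actually sneaks past publishing validation is untested):

```python
# Same trick applied to a service URL: assemble the literal at run time so the
# consolidation/validation step doesn't recognize it as a data-source path.
# (Untested against publishing; endpoint is the example from the question.)
scheme = "https" + "://"
host = "myserver.mydomain.com"
in_features = (
    scheme + host + "/server/rest/" + "services/"
    + "Folder/FeatureServiceName/" + "FeatureServer/1"
)
```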
It's not happening during consolidation; it's during validation, which evaluates whether or not the URL is a data source registered with the data store. And since the data store only registers direct DB connections, not valid REST endpoints, validation raises an error and advises copying the data to the server (a silly circular reference).
That said, faking the string as you suggested is intriguing, and I will try it to see if it sneaks past validation.
It's odd that this isn't accounted for, since the Python web API will happily use a feature service URL even in a federated environment, and you can share (and possibly publish, though I've not tried) that notebook; but for a script tool you cannot.
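For comparison, consuming the same endpoint through the ArcGIS API for Python would look something like this sketch. FeatureLayer is that library's class, the URL is the example endpoint from the question, and the import is guarded since the arcgis package may not be installed:

```python
# Sketch: the example endpoint consumed via the ArcGIS API for Python.
SERVICE_URL = (
    "https://myserver.mydomain.com/server/rest/services/"
    "Folder/FeatureServiceName/FeatureServer/1"
)

try:
    from arcgis.features import FeatureLayer  # ArcGIS API for Python
except ImportError:
    FeatureLayer = None  # package not installed; treat the rest as a sketch

if FeatureLayer is not None:
    # Construction is lazy: no request is made until properties are accessed
    # or a query runs. Federated/secured content would additionally need an
    # authenticated arcgis.gis.GIS connection passed via the gis argument.
    layer = FeatureLayer(SERVICE_URL)
```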
Obviously, the script tool works just fine while using Pro (it isn't necessary to add the layers to your session; Pro only needs to be signed in to Portal).
I can flip this to direct GDB connections, but the initial run of the script tool will be painfully slow (the database is remote, in the cloud) just so I can capture the result and use it to publish the GP service.
I'm unfortunately still not having any luck with this.
Is anyone from Esri willing to answer this? We have a number of GP tools we'd like to roll out to our users; they currently work in Pro using feature services within the code, but cannot be successfully published.
Why wouldn't you want to leverage the speed of feature services in your code vs. direct GDB connections?