Hi,
I am trying to publish my first geoprocessing web tool to our Enterprise. When I analyze the tool I am getting a number of 24032 errors, "Data source used by Script X is not registered with the server and will be copied to the server." Some of these issues seem to be related to mapped/named network drives versus UNC paths. I'm slowly making progress on those.
Question: Can AGOL feature layers referenced by URLs be used in the geoprocessing tool? Can I make AGOL a data store for our Portal? Currently we do not have a collaboration set up yet between our organization's AGOL and our Portal. We are just getting set up.
Appreciate any info as this is new territory.
Thanks in advance,
Kathy
Hello Kathy!
Reading data from publicly shared service URLs hard-coded into a script published as a geoprocessing web tool will likely not cause any issues, though there may be some limitations on seeing all of the data you expect to pull in.
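As a rough sketch of that read-only pattern (the URL and query below are just placeholders, not a real service):
from arcgis.features import FeatureLayer
# Placeholder URL - point this at the REST endpoint of the publicly shared layer
public_url = "https://services.arcgis.com/<org_id>/arcgis/rest/services/<service>/FeatureServer/0"
layer = FeatureLayer(public_url)  # no sign-in required for a publicly shared layer
surveys = layer.query(where="1=1", out_fields="*")  # returns a FeatureSet
print(len(surveys.features))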
However, if the item is not publicly shared, or if your workflow includes any effort to update, insert, or otherwise modify an aspect of the ArcGIS Online service layer referenced in the web tool, that would likely need to be done through a GIS item object using the appropriate ArcGIS API for Python module; see the related Support Article. The item would need to be accessed with the credentials of an account that has sufficient permissions (usually owner or admin) to make the proposed edits. For instance:
from arcgis.gis import GIS
# Ref doc: https://developers.arcgis.com/python/guide/working-with-different-authentication-schemes/
gis = GIS("https://www.arcgis.com", "username", "password")  # choose the authentication scheme that fits your setup
# Selects the first item returned from the search outlined in the reference doc:
# https://support.esri.com/en/Technical-Article/000024383
item0 = gis.content.search("your query parameters")[0]
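From there, edits are typically made against one of the item's layers. Continuing from the snippet above, a minimal sketch (the where clause, field, and value are hypothetical):
lyr = item0.layers[0]  # first layer of the hosted feature layer item
fset = lyr.query(where="OBJECTID = 1")  # hypothetical filter for the feature to change
feat = fset.features[0]
feat.attributes["STATUS"] = "Reviewed"  # hypothetical field and value
lyr.edit_features(updates=[feat])  # pushes the attribute update back to the service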
It may be helpful to get a fuller picture of your overall workflow. Would you be able to provide a rough pseudo-code of what you are aiming to do? Otherwise, I hope the above helps!
-Marc
Hi Marc Santos,
Thanks so much for the quick reply. Currently in our workflow, biologists are using Field Maps to collect data (AGOL, shared to a group, not public). Post-survey, we have a few script tools they run in Pro to download local copies of that day's survey.
The desktop script tool uses GetParameterAsText to take their AGOL username/password so it has access to the data. The script also uses the ArcGIS API for Python to intersect their survey data with other AGOL layers and update table attributes prior to downloading. This process works pretty well in the office.
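Roughly, the sign-in portion looks something like this (simplified; parameter order and names are just illustrative):
import arcpy
from arcgis.gis import GIS
# Credentials come in as tool parameters so each biologist signs in as themselves
username = arcpy.GetParameterAsText(0)
password = arcpy.GetParameterAsText(1)
gis = GIS("https://www.arcgis.com", username, password)
# ...the script then intersects the survey layer with other AGOL layers and updates attributes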
The attempt at the web tool was to offer an alternative option for downloading/geoprocessing the data when internet access is limited (research vessel). My thought was that a web tool might be a good option since the processing uses the server's network and not the client's, although for downloading the data they might still be limited.
We also have the Data Interoperability extension available on a VM, and another alternative I was trying to figure out is for a desktop Pro user to trigger the ETL to run on demand (versus on a schedule). I haven't figured out how, or if, this is possible. The reason to trigger it this way is that we do not have enough Data Interoperability licenses for all biologists to use.
Appreciate any thoughts on improving the workflow, but maybe a web tool is not a good alternative.
Thanks,
Kathy