Trying to run a geoprocessing tool against a service on Portal in Python...

12-16-2019 07:55 AM
KarenEllett
Occasional Contributor

I'm not quite sure where this should go, so cross-posting. We have an on-premise Portal with a feature service published. We're trying to write a Python script that will run a geoprocessing tool against this service. Right now it's extremely simple: authenticate to Portal, look at the service, and run arcpy.Statistics_analysis against it. All we want is an output table of statistics. If I do this in Python directly against the database, no problems. But when running it against the service... it SEES the service, but it fails on the analysis. Can anyone help?

def run(self):
    self._sign_into_portal()
    # self._build_scratch_fgdb()

    # The cursor CAN see the service - the OBJECTIDs print fine
    lines = arcpy.SearchCursor(
        "https://gisgas03dv.devl.scana.com/arcgis/rest/services/DENC/Yanceyville_UN/FeatureServer/3"
    )

    for line in lines:
        print(line.getValue("OBJECTID"))

    stats_table = "C:/temp/services_test.gdb/stats_out"
    # stats_table = self.config.get("scratch_fgdb_path")
    # print(stats_table)
    stat = [["st_length", "SUM"]]

    # This is the call that fails
    arcpy.Statistics_analysis(lines, stats_table, stat)

0 Kudos
5 Replies
Arne_Gelfert
Occasional Contributor III

Just starting to read more about geoprocessing services myself; that's how I found this post. Do you have an error/failure message? Could it be permissions? The service can't access the database, or the database isn't registered with the server? Those things trip me up all the time.

0 Kudos
deleted-user-NvcfpBOWaKwr
Occasional Contributor

Can you post any error information, please? We also don't have all of your code, so we can't see how you're making the connection, how you're accessing the service, or whether you're doing any kind of exporting.

How are you feeding data into the arcpy.Statistics_analysis tool?

Usually when I need to use arcpy to run analysis on published services, I export the service in a usable format (FGDB, shapefile, CSV, etc.), download the export, and unzip the file if needed. You then have data you can use in arcpy. There might be a function or tool within the ArcGIS API for Python that would let you do this without exporting - I didn't spend much time looking through the API, so if anyone knows a better way than what I'm about to show, please share.
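For what it's worth, querying the layer directly might look something like this - a minimal sketch, assuming the arcgis package with pandas is installed, and untested against your portal (the URL and field name are taken from Karen's example):

from arcgis.gis import GIS
from arcgis.features import FeatureLayer

gis = GIS("https://your.portal.url/portal", "username", "password")

# Point at a single layer by its REST endpoint URL
layer_url = "https://gisgas03dv.devl.scana.com/arcgis/rest/services/DENC/Yanceyville_UN/FeatureServer/3"
fl = FeatureLayer(layer_url, gis=gis)

# Pull the features into a spatially enabled DataFrame and summarize there,
# without ever touching arcpy
df = fl.query(where="1=1", out_fields="st_length").sdf
print(df["st_length"].sum())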

Otherwise, you would need to export your service to a format that you can use. Once it's exported, you download it:

import arcgis
import os
import datetime

########################################################################################################################
# Variables
########################################################################################################################

agoLogin = arcgis.GIS(url="input the url to your portal",
                      username="input your username",
                      password="input your password")

getTheService = agoLogin.content.get("Input the item ID # here")
serviceExportName = "{}_BACKUP_{}".format(getTheService.name,
                                          datetime.datetime.now().strftime("%m_%d_%Y"))

########################################################################################################################
# Export the service to File Geodatabase
########################################################################################################################

getTheService.export(serviceExportName, "File Geodatabase", parameters=None, wait=True)

########################################################################################################################
# Download the File Geodatabase and save it in the same folder the script runs in
########################################################################################################################

searchForExportedFGDB = agoLogin.content.search(query=serviceExportName)
exportedFGDBiD = searchForExportedFGDB[0].id
downloadExportedFGDB = agoLogin.content.get(exportedFGDBiD)

downloadExportedFGDB.download(save_path=os.path.dirname(os.path.abspath(__file__)))

########################################################################################################################
# Then use zipfile to unzip the geodatabase 
########################################################################################################################


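# A minimal sketch of the unzip step - this assumes the downloaded file lands
# next to the script as serviceExportName + ".zip"; adjust the filename to
# whatever actually gets saved to disk
import zipfile

zipPath = os.path.join(os.path.dirname(os.path.abspath(__file__)), serviceExportName + ".zip")
with zipfile.ZipFile(zipPath, "r") as zippedFGDB:
    zippedFGDB.extractall(os.path.dirname(zipPath))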

Once the export is downloaded, unzip it and use the feature class or shapefile as the input table for arcpy.Statistics_analysis(). If you are using a CSV, you will probably have to convert it to a table before you can use it.
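For the CSV case, that conversion could be as simple as this (paths and names are made up for illustration):

import arcpy

# Load the downloaded CSV into a geodatabase table the stats tool can read
arcpy.TableToTable_conversion("C:/temp/export.csv", "C:/temp/services_test.gdb", "csv_table")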

0 Kudos
KarenEllett
Occasional Contributor

We're able to log in to our portal successfully via script, no problem there. The error we get comes when we attempt to run the stats tool, and it's the super generic "arcgisscripting.ExecuteError: Failed to execute. Parameters are not valid." We presume this is because we're inputting a service and it expects a table/feature class. I know we could export the data and run the tool against that, but we're trying to figure out how to export the service.

In your example, you're using the item ID from the portal. That's an option, but is there any way to do this using the REST endpoint URL instead, as in my code example above? Doing it by item ID would presumably export the entire service, which is more than we need; we only want one layer (in this case, layer ID 3) to be exported.

Thanks! 

0 Kudos
JamesHood
Occasional Contributor

Hi Karen.


I would guess that you are trying to run a geoprocessing tool using an input that is invalid: namely, the published service, which is a REST endpoint that outputs JSON.

 

The input needs to be a table view: https://desktop.arcgis.com/en/arcmap/10.3/tools/analysis-toolbox/summary-statistics.htm 

My workaround would be to connect to the REST endpoint, convert the JSON to a feature class, and the feature class to a layer, either in memory or on disk, then run the geoprocessing tool.

Something like this:

import urllib2, arcpy, os

# Request the layer's query endpoint so the response comes back as Esri JSON
# (requesting the bare layer URL returns an HTML page, not JSON)
myurl = r"http://server.arcgisonline.com/arcgis/rest/services/Reference/World_Transportation/MapServer/0/query?where=1%3D1&outFields=*&f=json"
JSON_OUT = r"C:\DATA\text.json"

# Save the JSON response to disk
response = urllib2.urlopen(myurl)
myJSON = response.read()
foo = open(JSON_OUT, "wb")
foo.write(myJSON)
foo.close()

# Convert the Esri JSON file to a feature class
arcpy.JSONToFeatures_conversion(JSON_OUT, r"C:\DATA\local_Mercator2.gdb\JSON")
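
From there you could feed the converted feature class straight into the stats tool (field name borrowed from Karen's example above, so treat it as a placeholder):

arcpy.Statistics_analysis(r"C:\DATA\local_Mercator2.gdb\JSON",
                          r"C:\DATA\local_Mercator2.gdb\stats_out",
                          [["st_length", "SUM"]])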

https://gis.stackexchange.com/questions/244411/how-to-access-a-secured-arcgis-rest-api-using-python-... 

0 Kudos
DougBrowning
MVP Esteemed Contributor

I do not think you can just give it a URL and have it go. I am doing something similar, where I run QA from a Hosted Feature Service directly.

Well, it's kind of direct. It still has to "download" the data, but it goes from a JSON object right to a search cursor. Mine has to poll a SAML card, so it is more complicated. (But you have not been able to hardcode a password for some time due to OAuth2 - at least that is what Esri said when my old script went out.)

It's hard to take out the auth code, but here are the basics. See the bottom for a simple example.

Hope that helps.  I use it a lot.

Now, this is a search cursor like you wrote in your script. But if you want to run a geoprocessing tool, I am not sure. You could try giving it the object below, or you may have to download a copy, unzip it, and feed that into the tool.

If so, you can use this same code but submit a download job instead. I use this to back up AGOL every night; see the next script below.

import os
from uuid import uuid4
import requests
import arcpy, time, datetime
from requests_arcgis_auth import ArcGISPortalSAMLAuth
# this is on GitHub if you need it for SAML
import tempfile
import wincertstore


# New code from Patrick to handle the SSL scanner
pem_file = r''
with tempfile.NamedTemporaryFile(delete=False) as tf:
    for storename in ("CA", "ROOT"):
        with wincertstore.CertSystemStore(storename) as store:
            for cert in store.itercerts(usage=wincertstore.SERVER_AUTH):
                try:
                    # Py v2: get_pem() returns a byte string
                    tf.write(cert.get_pem().decode("ascii"))
                except (AttributeError, TypeError):
                    # Py v3: the temp file expects bytes
                    tf.write(bytes(cert.get_pem(), "ascii"))
    tf.flush()
    pem_file = tf.name
    tf.close()


# Setup Parameters & Authentication
url = r'https://yourorg.maps.arcgis.com/sharing/rest'
client_id = r'okjhasdfhahl'
s = requests.session()
# New add from Patrick to handle SSL scanner
s.verify = pem_file
s.auth = ArcGISPortalSAMLAuth(client_id,verify=pem_file)
#s.auth = ArcGISPortalSAMLAuth(client_id)
# For GeoPlatform
serviceURL = r'https://www.arcgis.com/sharing/rest'

# Standard end of URL
endURL = "/query?f=json&where=1=1&outFields=*&returnGeometry=true"

# set the service item id here (value redacted - use your own item ID)
# MT
item_id_to_query = r'sdfgsdfgdgsgsdg'


# feature class numbers vary by state - set them up here
# MT (only the two layers used below are uncommented)
plotsURL = "/" + "0" + endURL
photosURL = "/" + "1" + endURL
##plotcharURL = "/" + "2" + endURL
##lpiURL = "/" + "3" + endURL
##gapURL = "/" + "4" + endURL
##specrichURL = "/" + "5" + endURL
##soilURL = "/" + "6" + endURL
##plotobURL = "/" + "7" + endURL
##gapdetailURL = "/" + "15" + endURL
##lpidetailURL = "/" + "14" + endURL
##specrichdetailURL = "/" + "12" + endURL
##unknownURL = "/" + "9" + endURL
##print "Running MT"
##useLogFile("MT", logFileDir)

# THIS is where you load into an object for the search cursor
# load in all the layers into vars
# get the base URL and token started
response = s.post(url+"/content/items/{}?f=json".format(item_id_to_query))
fs_endpoint = response.json().get('url')

# plots
q_response = s.get(fs_endpoint + plotsURL)
plotsF = arcpy.FeatureSet()
plotsF.load(q_response.url)
# photos
q_response = s.get(fs_endpoint + photosURL)
photosF = arcpy.FeatureSet()
photosF.load(q_response.url)
# and more...

plotKeys = []
# get a list of Eval PlotKeys from Plots
whereClause = "EvalStatus = 'Eval'"
with arcpy.da.SearchCursor(plotsF,"PlotKey", whereClause) as cursor:
    for row in cursor:
        plotKeys.append(row[0])

# do other stuff
# old code that I left in here in case it makes more sense without all the card stuff
# Make a request to the item & obtain the feature service endpoint
##response = s.post(url+"/content/items/{}?f=json".format(item_id_to_query))
##fs_endpoint = response.json().get('url')
##print ("Item ID {} feature service is hosted at {}".format(item_id_to_query,fs_endpoint))
##
### Query the end-point.
###   Keep in mind, this only works because the service is 'federated' with the portal.
###   This would not work for ANY service (including public/anonymous) as it will attach the token, and this token is only good for the specific portal we are working with
##q_response = s.get(fs_endpoint + "/0/query?f=json&where=1=1&outFields=*&returnGeometry=true")
##features = q_response.json().get('features')
##print ("Found {} records on layer 0 of the fs endpoint".format(len(features)))
##
### Load to a 'feature set'
###   Keep in mind, this works because the 'token' is present in the HTTP GET URL.
###   May not be supported if it needs to be a POST or the parameters end up in the body of the request
##fs = arcpy.FeatureSet()
##fs.load(q_response.url)

Download part of the script:

    # Fragment of a larger function - item_to_export, export_title, username,
    # check_job_status_interval_in_sec, and download_folder are set elsewhere
    # Submit a job to export a service to FGDB (to user folder)
    print ("Submitting an AGOL job to export Feature Service data to FGDB from Item ID {}...".format(item_to_export))
    params = {}
    params['itemId'] = item_to_export
    params['title'] = export_title
    params['exportFormat'] = 'File Geodatabase'
    params['f']='json'
    response = s.post(serviceURL+"/content/users/{}/export".format(username),data=params)
    new_item_id = response.json().get('exportItemId')
    job_id = response.json().get('jobId')
    """{u'exportFormat': u'fileGeodatabase',
     u'exportItemId': u'2009ac16f2bc4dbgsdfg4f2a89f',
     u'jobId': u'5927523c-4af6-4ada-bc00-d8sdgsdgsdfd0::KbxwQRRfWyEYLgp4',
     u'ownerFolder': u'342aesdfgsdfgsdfgsd1b92beed',
     u'serviceItemId': u'54250asfdgsdfgsdfgsddd82a363cf4',
     u'size': 167141376,
     u'type': u'file'}"""
    print ("Job {} submitted.  \nNew Item ID: {}".format(job_id,new_item_id))

    # Check Status (loop until completed)
    print ("Checking the job status every {} seconds...".format(check_job_status_interval_in_sec))
    params = {}
    params['jobId'] = job_id
    params['jobType'] = 'export'
    params['f'] = 'json'
    status = "processing"
    while status=="processing":
        status_response = s.post(serviceURL+"/content/users/{}/items/{}/status".format(username,new_item_id),data=params)
        #{u'itemId': u'2009ac16dfasdfasda89f',u'status': u'processing',u'statusMessage': u'Job Status for jobId: 5dfa23c-4af6-4ada-bc00-d8asdfasdfad0'}
        status = status_response.json().get('status')
        exported_item_id = status_response.json().get('itemId')
        print (" -- Job Status: {}".format(status))
        if status == "processing":
            #print (" -- Checking status in {} seconds".format(check_job_status_interval_in_sec))
            time.sleep(check_job_status_interval_in_sec)

    print ("Final Export Status: {}".format(status))

    # Download FGDB Export
    print ("Downloading final exported FGDB...")
    download_response = s.get(serviceURL+"/content/items/{}/data?f=json".format(exported_item_id))

    # Save to local
    dt = datetime.datetime.now().strftime("%Y%m%d_%H%M%S")
    out_file = os.path.join(download_folder,"{}_{}.zip".format(export_title,dt))
    print ("Saving final downloaded FGDB to {}...".format(out_file))
    f = open(out_file,'wb')
    f.write(download_response.content)
    f.close()

    # Remove the extracted FGDB from AGOL (cleanup)
    print ("Removing the exported FGDB ({}) on AGOL (cleaning up)...".format(exported_item_id))
    params = {}
    params['f'] = "json"
    params['items'] = exported_item_id
    delete_response = s.post(serviceURL+r'/content/users/{}/deleteItems'.format(username),data=params)
    print ("Delete result: {}".format(delete_response.content))
0 Kudos