I want to take a published Web Map ID (a GUID) and copy certain feature layers/feature sets from the underlying data to a feature class, run geoprocessing on those copies, then update/add the data back to the underlying feature classes in the Web Map.
I have gotten as far as:
--Identifying the Web Map using the supplied GUID
--Identifying the various feature services/web feature layers that make up the Web Map (3 are used, each containing different feature classes)
--then drilling down into those feature services to find the desired feature set (the point or line feature I need to process) and assign it to a variable. A FeatureSet is the only object type I have been able to create from a Feature Service.
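For reference, the layer lookup can be sketched on the web map's JSON alone, without any GIS libraries. This is a minimal illustration, not the actual services: the titles and URLs below are made up, and it assumes the web map item's `operationalLayers` have already been fetched as a dict (e.g. from the item's `/data` REST endpoint):

```python
import json

# A pared-down web map definition; titles and URLs are placeholders.
web_map_json = json.loads("""
{
  "operationalLayers": [
    {"title": "Line",  "url": "https://example.com/arcgis/rest/services/Lines/FeatureServer/0"},
    {"title": "Point", "url": "https://example.com/arcgis/rest/services/Points/FeatureServer/0"}
  ]
}
""")

def find_layer_url(web_map, title):
    """Return the service URL of the first operational layer with the given title."""
    for layer in web_map.get("operationalLayers", []):
        if layer.get("title") == title:
            return layer["url"]
    return None

point_url = find_layer_url(web_map_json, "Point")
```

The same title-matching idea works against the `layers` property of a `WebMap` object in the ArcGIS API for Python.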
I now need to copy that feature set to a feature class (in a file geodatabase, an enterprise geodatabase, or in memory) so that I can run my geoprocessing arcpy script.
The script works as long as the database connections are hard-coded. It runs various geoprocessing functions (Near_analysis, CopyFeatures, Snap_edit, SplitLineAtPoint, AddGeometryAttributes, DeleteIdentical, along with several SearchCursors and UpdateCursors), producing updated point and line feature classes that I then need to update/add back to the feature classes used in the original Web Map.
The issue is that I do not know where the original data will be. The customer will be generating the Web Maps, so I cannot hard-code the database connections. I only have a GUID for the Web Map the data needs to come from and go back to; that is my only connection to the desired data.
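One detail worth checking up front: a GUID field usually stores the value with braces and hyphens, while Portal/AGOL item IDs are 32 hex characters with neither, so the supplied value may need normalizing before it is passed to `gis.content.get()`. A small standard-library sketch (the GUID value below is made up):

```python
import uuid

def guid_to_item_id(guid_string):
    """Normalize a GUID in any common format (braces, hyphens, mixed case)
    to the 32-hex-character form that portal item IDs use."""
    return uuid.UUID(guid_string).hex

# Both spellings collapse to the same item-id-style string:
a = guid_to_item_id("{1F9D6E22-3A50-4B21-9E4A-0C5D8F7A1B2C}")
b = guid_to_item_id("1f9d6e22-3a50-4b21-9e4a-0c5d8f7a1b2c")
```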
I was finally able to get this to work by starting from the supplied WebMap GUID.
This then allowed me to run my geo-processing tools and report my findings. The only thing I have yet to figure out is how to push my updates/additions back to the underlying feature classes in the feature services...
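For that remaining write-back step, one approach worth trying is the feature layer's applyEdits REST endpoint (wrapped by `FeatureLayer.edit_features()` in the ArcGIS API for Python), which accepts adds and updates as arrays of ArcGIS JSON features. This is only a sketch of the payload construction, with made-up field names and geometry, not a tested round trip:

```python
import json

def build_apply_edits_payload(adds=None, updates=None):
    """Build the form parameters for a feature layer's applyEdits call.
    'adds' and 'updates' are lists of ArcGIS JSON features; each update
    must carry the target OBJECTID in its attributes."""
    return {
        "adds": json.dumps(adds or []),
        "updates": json.dumps(updates or []),
        "f": "json",
    }

# A hypothetical split point to add, and a line whose attribute changed:
payload = build_apply_edits_payload(
    adds=[{"geometry": {"x": -93.1, "y": 44.9},
           "attributes": {"state": "new"}}],
    updates=[{"attributes": {"OBJECTID": 7, "state": "split"}}],
)
# payload would then be POSTed to <layer_url>/applyEdits along with the token.
```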
Here is the code that worked for me (with some data/code redacted or changed; sorry, I couldn't get the editor to format the code well):
import arcpy
from arcpy import env
import timeit
import uuid
from arcgis.gis import GIS
from arcgis import features
from arcgis.mapping import WebMap
from collections import OrderedDict
import json
import urllib
import http.client

start_time = timeit.default_timer()

# Constant values
SURVEY_DB_WORKSPACE = r'C:\Users\survey_gdb.sde'
# survey table server instance and owner prefix values
SURVEY_DB_INSTANCE_AND_OWNER = 'mydb.dbo.'
GEO_PROC_WORKSPACE = r"C:\Users\geo_proc_gdb.sde"
# geo-proc server instance and owner prefix values
GEO_PROC_DB_INSTANCE_AND_OWNER = 'mydb_gdb.DBO.'
# rest services credentials
USERNAME = 'user'
PASSWORD = 'password'
PORTAL_URL = 'https://mywebsite/portal'
SERVERNAME = "mywebsite"
PORT = None  # if using a web adaptor, leave as None (and modify TOKEN_URL)
TOKEN_URL = "https://mywebsite/server/admin/generateToken"

def main():
    # set environment to the Survey database, loop through surveys table, grab WebMap GUID
    workspace = SURVEY_DB_WORKSPACE
    arcpy.env.workspace = workspace
    arcpy.env.overwriteOutput = True
    surveys = SURVEY_DB_INSTANCE_AND_OWNER + 'Surveys'
    where_clause = "active = 1"
    fields = ['OID@', 'Name', 'Id', 'Progress']
    web_map_ids = OrderedDict()
    with arcpy.da.SearchCursor(surveys, fields) as cursor:
        for row in cursor:
            web_map_ids[row[0]] = [row[1], row[2], row[3]]

    # change environment to the geo-processing database
    workspace = GEO_PROC_WORKSPACE
    # connect to db and set point and line features
    field = 'state'
    lines_backup = GEO_PROC_DB_INSTANCE_AND_OWNER + 'lines_backup'
    points_backup = GEO_PROC_DB_INSTANCE_AND_OWNER + 'points_backup'
    tmp_lines_split = GEO_PROC_DB_INSTANCE_AND_OWNER + 'tmp_lines_split'
    lines_split = GEO_PROC_DB_INSTANCE_AND_OWNER + 'lines_split'
    arcpy.env.workspace = workspace
    arcpy.env.overwriteOutput = True

    # Define how to acquire a rest services token
    # from https://community.esri.com/thread/83654
    def getToken(username, password, serverName, serverPort):
        params = urllib.parse.urlencode({'username': USERNAME, 'password': PASSWORD,
                                         'client': 'requestip', 'f': 'json'})
        headers = {"Content-type": "application/x-www-form-urlencoded",
                   "Accept": "text/plain"}
        # Connect to URL and post parameters
        httpConn = http.client.HTTPSConnection(serverName, serverPort)
        httpConn.request("POST", TOKEN_URL, params, headers)
        # Read response
        response = httpConn.getresponse()
        print(response.status)
        if response.status != 200:
            httpConn.close()
            print("Error while fetching tokens from admin URL. "
                  "Please check the URL and try again.")
            return
        else:
            data = response.read()
            httpConn.close()
            # Check that data returned is not an error object
            if not assertJsonSuccess(data):
                return
            # Extract the token from it
            token = json.loads(data)
            return token['token']
    ### End getToken function

    # A function that checks that the input JSON object is not an error object.
    def assertJsonSuccess(data):
        obj = json.loads(data)
        if 'status' in obj and obj['status'] == "error":
            print("Error: JSON object returns an error. " + str(obj))
            return False
        else:
            return True
    ### End assertJsonSuccess function

    """
    This is the Main loop through WebMap values, processing and
    reporting back to survey, then continuing
    """
    # loop through the survey web map guids for processing - this is where the GUID values come from
    current_survey = ''
    for k, v in web_map_ids.items():
        if v[1] is None:
            continue
        current_survey = v[0]
        # get WebMap
        gis = GIS(PORTAL_URL, username=USERNAME, password=PASSWORD)
        web_map_item = gis.content.get(v[1])
        web_map = WebMap(web_map_item)
        web_map_layers = web_map.layers
        # extract the feature service layer that contains the Point feature class
        num = 0
        for item in web_map_layers:
            for tag in item:
                if tag == 'title' and item[tag] == 'Point':
                    num = web_map_layers.index(item)
        point_url = web_map_layers[num]['url']
        # from https://www.esri.com/arcgis-blog/products/arcgis-desktop/analytics/quick-tips-consuming-feature-services-with-geoprocessing/
        # Extract Points from feature service to feature class
        baseURL = point_url
        where = '1=1'
        fields = '*'
        token = getToken(USERNAME, PASSWORD, SERVERNAME, PORT)
        query = "?where={}&outFields={}&returnGeometry=true&f=json&token={}".format(where, fields, token)
        # See http://services1.arcgis.com/help/index.html?fsQuery.html for more info on FS-Query
        fsURL = baseURL + query
        fs = arcpy.FeatureSet()
        fs.load(fsURL)
        # clean up prior to re-creating backups
        if arcpy.Exists(lines_backup):
            arcpy.Delete_management(lines_backup)
        if arcpy.Exists(points_backup):
            arcpy.Delete_management(points_backup)
        # copy Points from feature service to feature class, for geo-processing
        arcpy.CopyFeatures_management(fs, points_backup)
        # ...remaining geo-processing and reporting steps redacted...

if __name__ == '__main__':
    main()
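A side note on the query string above: it is concatenated by hand, which breaks if the WHERE clause or token ever contains characters that need escaping. The same URL can be assembled with `urllib.parse.urlencode` from the standard library. A sketch, with a placeholder service URL and token; it also appends the `/query` endpoint explicitly, since a layer URL taken from a web map normally ends at the layer index:

```python
from urllib.parse import urlencode

def build_query_url(layer_url, token, where="1=1", out_fields="*"):
    """Build a feature-layer /query URL that returns all matching
    features as JSON, suitable for arcpy.FeatureSet().load()."""
    params = urlencode({
        "where": where,
        "outFields": out_fields,
        "returnGeometry": "true",
        "f": "json",
        "token": token,
    })
    return "{}/query?{}".format(layer_url.rstrip("/"), params)

fs_url = build_query_url(
    "https://example.com/arcgis/rest/services/Points/FeatureServer/0",
    token="abc123",
)
```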