Is there any way to have survey responses automatically update a field in a separate layer? For example, I have a layer that consists of all the bee yards in a particular region (point layer). The layer has multiple attributes, but the one I need to update as survey responses come in is the colony count (numerical).
When the beekeepers travel to each yard, they use Survey123 to report what activities they performed while there. Two of the questions ask whether they removed or added any colonies. If they removed colonies, that quantity needs to be subtracted from the yard's colony count; if they added colonies, it needs to be added, and so on. Is there a way to accomplish this yard inventory process without doing the additions and subtractions manually, perhaps with a Python script or something similar?
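The bookkeeping itself is just signed arithmetic per survey response. As a minimal sketch of that logic (the field names `colonies_added` and `colonies_removed` are hypothetical placeholders, not an actual Survey123 schema):

```python
def updated_colony_count(current_count, survey):
    """Apply one survey response's add/remove quantities to a yard's count.

    `survey` holds the Survey123 answers as a dict; blank or missing
    answers count as zero. Field names are illustrative placeholders.
    """
    added = int(survey.get('colonies_added') or 0)
    removed = int(survey.get('colonies_removed') or 0)
    return current_count + added - removed

# A yard holding 40 colonies: the beekeeper removed 5 and added 2
print(updated_colony_count(40, {'colonies_removed': 5, 'colonies_added': 2}))  # prints 37
```

A script on a schedule (or a webhook) would run this against each new response and write the result back to the yard layer.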
Solved!
I don't have anything public that I can share right this second, but this is the closest I can get you. In this use case, an API is being queried and the results are written to a REST service. You could swap out the source API for your data source and pretty much everything else would fall into alignment. FYI, I am not using the easiest method, which would be to use the libraries AGOL offers in Notebooks. This script was a quick demo showing we could lift code from our batch server, drop it into Portal/AGOL Notebooks, and have things run without too much modification (in some cases).
# Configure for DB or Feature Service
useDB = 0  # Binary off / on

# Data Source
url = 'https://www.publicstuff.com/api/2.0/requests_list?return_type=json'

# API Key
api_key = ''

# Max Pull per Record Type
max_pull = 1000  # Set to 'Max' if you don't want to limit it.

# Retrieve Request Type (format = ID Type being gathered, Title, time increment,
# time increment type (days, hours, minutes, seconds), E-mail Active binary, E-mails to,
# TTL Time Increment, TTL Time type (day, hour, minute, second), purge archived binary)
request_type = [['1017049', 'Illegal Gatherings', 1, 'days', 0, 'test@bellevuewa.gov', 2, 'days', 0],
                ['1015636', 'Illegal Fireworks', 1, 'days', 0, 'test@bellevuewa.gov', 1, 'days', 0]]

# Configure the e-mail server and other info here.
mail_server = 'test.bellevuewa.gov'
mail_from = 'MyBelleuve App<noreply@bellevuewa.gov>'
onfail_mail_to = 'test@bellevuewa.gov'

# Targeted Service & layer for Data
service_URL = 'https://services1.arcgis.com/EYzEZbDhXZjURPbP/arcgis/rest/services/My_Bellevue_Apps/FeatureServer/0/'

# ArcGIS Online Credentials
AGOL_User = 'gisdba_cobgis'
AGOL_Pass = 'Put the Password Here'

# ------------------------------------------------------------------------------
# DO NOT UPDATE BELOW THIS LINE OR RISK DOOM AND DESPAIR! Have a nice day!
# ------------------------------------------------------------------------------

# Import Python libraries
import urllib.parse
import urllib.request
import time
import datetime
import re
import requests, json, collections, string
import smtplib

# ------------------------------------------------------------------------------
#
# Functions
#
# ------------------------------------------------------------------------------

def gatherdata(typeID, typeTitle, typeTimeIncrement, typeTimeIncrementType):
    pull_window = determineTime(typeTimeIncrement, typeTimeIncrementType)
    payload = retrieveData(typeID, pull_window)
    return payload

def determineTime(typeTimeIncrement, typeTimeIncrementType):
    currentDateTime = int(time.time())  # Unix Time
    if typeTimeIncrementType.upper() == 'SECONDS':
        increment = typeTimeIncrement
    elif typeTimeIncrementType.upper() == 'MINUTES':
        increment = typeTimeIncrement * 60
    elif typeTimeIncrementType.upper() == 'HOURS':
        increment = typeTimeIncrement * 3600
    elif typeTimeIncrementType.upper() == 'DAYS':
        increment = typeTimeIncrement * 86400
    print('Time Window: {} seconds'.format(increment))
    pull_window = str(currentDateTime - increment)
    print('Unix Time Calc: {}'.format(pull_window))
    return pull_window

def retrieveData(typeID, pull_window):
    pull_url = url + '&request_type_id={}'.format(typeID)
    if max_pull > 0:
        pull_url = pull_url + '&limit={}'.format(max_pull)
    if int(pull_window) > 0:
        pull_url = pull_url + '&after_timestamp={}'.format(pull_window)
    pull_url = pull_url + '&api_key={}'.format(api_key)
    status_response = requests.get(pull_url)
    status_data = status_response.json()
    return status_data

def send_new(date_created, id, status, title, description, address, zipcode, latitude, longitude, rank, user, location,
             user_comments, user_follows, user_request, comment_count, followers, supporters, image_thumbnail,
             typeTitle, typeEmailAddresses):
    mail_priority = '3'
    mail_subject = '{} Reported'.format(typeTitle.replace('\'', ''))
    mail_msg = ('An {} has been reported. Details are as follows:\n\n' +
                'Report ID: {}\n' +
                ' Title: {}\n' +
                ' Description: {}\n' +
                ' Reported On: {}\n' +
                ' Address: {}, {}\n' +
                ' Location: {}\n' +
                ' Maplink: https://www.google.com/maps/search/?api=1&query={},{}\n' +
                '\n[SYSTEM GENERATED MESSAGE]').format(typeTitle.replace('\'', ''), id.replace('\'', ''),
                                                      title.replace('\'', ''), description.replace('\'', ''),
                                                      date_created.replace('\'', ''), address.replace('\'', ''),
                                                      zipcode.replace('\'', ''), location.replace('\'', ''),
                                                      latitude.replace('\'', ''), longitude.replace('\'', ''))
    email_target = '{}'.format(typeEmailAddresses)
    send_mail = ('To: {}\n' +
                 'From: {}\n' +
                 'X-Priority: {}\n' +
                 'Subject: {}\n\n' +
                 '{}').format(email_target, mail_from, mail_priority, mail_subject, mail_msg)
    print(' Sending message to {} recipients.\n'.format(email_target))
    email_target_final = email_target
    server = smtplib.SMTP(mail_server)
    server.sendmail(mail_from, email_target_final, send_mail)
    server.quit()
    return

def send_error(error_status):
    mail_priority = '1'
    mail_subject = 'Script Error: My Bellevue Data Pull'
    mail_msg = ('An error has been reported with a recent data pull. Info as follows:\n\n' +
                'Error: {}\n\n' +
                'Please check this script ASAP!\n' +
                '\n[SYSTEM GENERATED MESSAGE]').format(error_status)
    email_target = '{}'.format(onfail_mail_to)
    send_mail = ('To: {}\n' +
                 'From: {}\n' +
                 'X-Priority: {}\n' +
                 'Subject: {}\n\n' +
                 '{}').format(email_target, mail_from, mail_priority, mail_subject, mail_msg)
    print(' Sending message to {} recipients.\n'.format(email_target))
    email_target_final = email_target
    server = smtplib.SMTP(mail_server)
    server.sendmail(mail_from, email_target_final, send_mail)
    server.quit()
    return

def get_token():
    url = 'https://www.arcgis.com/sharing/rest/generateToken'
    values = {'f': 'json',
              'username': AGOL_User,
              'password': AGOL_Pass,
              'referer': 'https://www.arcgis.com',
              'expiration': '10'}
    data = urllib.parse.urlencode(values).encode("utf-8")
    req = urllib.request.Request(url)
    # Crude retry: keep trying until the request succeeds
    response = None
    while response is None:
        try:
            response = urllib.request.urlopen(req, data=data)
        except:
            pass
    the_page = response.read()
    # Garbage Collection with some house building
    payload_json = the_page.decode('utf8')
    payload_json = json.loads(payload_json)
    edit_token = payload_json['token']
    return edit_token

def pushToAGOL(date_created, id, status, title, description, address, zipcode, latitude, longitude, rank, user, location,
               user_comments, user_follows, user_request, comment_count, followers, supporters, image_thumbnail,
               typeTitle, typeEmailAddresses, typeEmailActive):
    # Get token for editing/checking
    edit_token = get_token()
    # Get count
    item_count, objectID = queryCount(id, edit_token)
    if item_count > 0:
        upload_payload = [{
            'geometry': {'x': longitude, 'y': latitude},
            'attributes': {
                'OBJECTID': objectID,
                'DateCreate': date_created,
                'RequestType': typeTitle,
                'Id': id,
                'Status': status,
                'Title': title,
                'Description': description,
                'Address': address,
                'ZipCode': zipcode,
                'Latitude': latitude,
                'Longitude': longitude,
                'Rank': rank,
                'Submitter': user,
                'Location': location,
                'UserComments': user_comments,
                'UserFollows': user_follows,
                'UserRequest': user_request,
                'CommentCount': comment_count,
                'Followers': followers,
                'Supporters': supporters
            }
        }]
        update_AGOL(upload_payload, edit_token)
    else:
        upload_payload = [{
            'geometry': {'x': longitude, 'y': latitude},
            'attributes': {
                'DateCreate': date_created,
                'RequestType': typeTitle,
                'Id': id,
                'Status': status,
                'Title': title,
                'Description': description,
                'Address': address,
                'ZipCode': zipcode,
                'Latitude': latitude,
                'Longitude': longitude,
                'Rank': rank,
                'Submitter': user,
                'Location': location,
                'UserComments': user_comments,
                'UserFollows': user_follows,
                'UserRequest': user_request,
                'CommentCount': comment_count,
                'Followers': followers,
                'Supporters': supporters
            }
        }]
        insert_AGOL(upload_payload, edit_token)
    return

def insert_AGOL(upload_payload, edit_token):
    edit_token = get_token()
    FS_service = service_URL + 'addFeatures/?token={}'.format(edit_token)
    data = urllib.parse.urlencode({'f': 'json', 'features': upload_payload}).encode('utf-8')
    response = None
    while response is None:
        try:
            req = urllib.request.Request(FS_service)
            response = urllib.request.urlopen(req, data=data)
        except:
            pass
    the_page = response.read()
    the_json = response.info().get_content_charset('utf-8')
    parsethis = json.loads(the_page.decode(the_json))
    for item in parsethis['addResults']:
        if item['success'] == False:
            error_status = '{}\n\nPayload update sent: {}'.format(parsethis, upload_payload)
            send_error(error_status)
            print(' FAILURE!!! Point Record Add Failed.')
        else:
            print(' Point Record Added.')
    return

def update_AGOL(upload_payload, edit_token):
    edit_token = get_token()
    FS_service = service_URL + 'updateFeatures/?token={}'.format(edit_token)
    data = urllib.parse.urlencode({'f': 'json', 'features': upload_payload}).encode('utf-8')
    response = None
    while response is None:
        try:
            req = urllib.request.Request(FS_service)
            response = urllib.request.urlopen(req, data=data)
        except:
            pass
    the_page = response.read()
    the_json = response.info().get_content_charset('utf-8')
    parsethis = json.loads(the_page.decode(the_json))
    for item in parsethis['updateResults']:
        if item['success'] == False:
            error_status = '{}\n\nPayload update sent: {}'.format(parsethis, upload_payload)
            send_error(error_status)
            print(' FAILURE!!! Point Record Update Failed.')
        else:
            print(' Point Record Updated.')
    return

def queryCount(id, edit_token):
    edit_token = get_token()
    FS_service = service_URL + 'query/?token={}'.format(edit_token)
    where_statement = 'Id=\'{}\''.format(id)
    data = urllib.parse.urlencode({'f': 'json', 'where': where_statement, 'returnCountOnly': 'true'}).encode('utf-8')
    req = urllib.request.Request(FS_service)
    response = urllib.request.urlopen(req, data=data)
    response_payload = response.read()
    response_payload = json.loads(response_payload)
    item_count = response_payload['count']
    if item_count > 0:
        response = None
        while response is None:
            try:
                where_statement = 'Id=\'{}\''.format(id)
                data = urllib.parse.urlencode({'f': 'json', 'where': where_statement, 'outFields': 'OBJECTID'}).encode('utf-8')
                req = urllib.request.Request(FS_service)
                response = urllib.request.urlopen(req, data=data)
                response_payload = response.read()
                response_payload = json.loads(response_payload)
                for oid_item in response_payload['features']:
                    objectID = oid_item['attributes']['OBJECTID']
            except:
                pass
    else:
        objectID = '0'
    print('MyBellevue App ID: {}'.format(id))
    return (item_count, objectID)

def queryCount_Cleaning(TTL_search, typeTitle):
    edit_token = get_token()
    FS_service = service_URL + 'query/?token={}'.format(edit_token)
    where_statement = 'EditDate<\'{}\' and RequestType=\'{}\' and Status=\'completed\''.format(TTL_search, typeTitle)
    data = urllib.parse.urlencode({'f': 'json', 'where': where_statement, 'returnCountOnly': 'true'}).encode('utf-8')
    req = urllib.request.Request(FS_service)
    response = urllib.request.urlopen(req, data=data)
    response_payload = response.read()
    response_payload = json.loads(response_payload)
    item_count = response_payload['count']
    if item_count > 0:
        response = None
        while response is None:
            try:
                where_statement = 'EditDate<\'{}\' and RequestType=\'{}\' and Status=\'completed\''.format(TTL_search, typeTitle)
                data = urllib.parse.urlencode({'f': 'json', 'where': where_statement, 'outFields': 'OBJECTID'}).encode('utf-8')
                req = urllib.request.Request(FS_service)
                response = urllib.request.urlopen(req, data=data)
                response_payload = response.read()
                response_payload = json.loads(response_payload)
                for oid_item in response_payload['features']:
                    objectID = oid_item['attributes']['OBJECTID']
                    upload_payload = [{
                        'attributes': {
                            'OBJECTID': objectID,
                            'Status': 'archived'
                        }
                    }]
                    update_AGOL(upload_payload, edit_token)
            except:
                print('Failed to clean entirely')
    else:
        print('No cleaning required.')
    return

def queryCount_Purge(TTL_search, typeTitle):
    edit_token = get_token()
    FS_service = service_URL + 'query/?token={}'.format(edit_token)
    where_statement = 'EditDate < \'{}\' and RequestType = \'{}\' and Status in (\'completed\', \'archived\')'.format(TTL_search, typeTitle)
    data = urllib.parse.urlencode({'f': 'json', 'where': where_statement, 'returnCountOnly': 'true'}).encode('utf-8')
    req = urllib.request.Request(FS_service)
    response = urllib.request.urlopen(req, data=data)
    response_payload = response.read()
    response_payload = json.loads(response_payload)
    item_count = response_payload['count']
    if item_count > 0:
        response = None
        while response is None:
            try:
                where_statement = 'EditDate < \'{}\' and RequestType = \'{}\' and Status in (\'completed\', \'archived\')'.format(TTL_search, typeTitle)
                data = urllib.parse.urlencode({'f': 'json', 'where': where_statement}).encode('utf-8')
                FS_service = service_URL + 'deleteFeatures/?token={}'.format(edit_token)
                req = urllib.request.Request(FS_service)
                response = urllib.request.urlopen(req, data=data)
                response_payload = response.read()
                response_payload = json.loads(response_payload)
            except:
                print('Failed to purge entirely')
    else:
        print('No purging required.')
    return

def clean_up(typeTitle, typeTTLTimeIncrement, typeTTLTimeIncrementType, typePurge):
    currently_now = datetime.datetime.now()
    if typeTTLTimeIncrementType.upper() == 'SECONDS':
        typeTTLTimeIncrementType = 'SECOND'
        TTL_search = currently_now - datetime.timedelta(seconds=int(typeTTLTimeIncrement))
    elif typeTTLTimeIncrementType.upper() == 'MINUTES':
        typeTTLTimeIncrementType = 'MINUTE'
        TTL_search = currently_now - datetime.timedelta(minutes=int(typeTTLTimeIncrement))
    elif typeTTLTimeIncrementType.upper() == 'HOURS':
        typeTTLTimeIncrementType = 'HOUR'
        TTL_search = currently_now - datetime.timedelta(hours=int(typeTTLTimeIncrement))
    elif typeTTLTimeIncrementType.upper() == 'DAYS':
        typeTTLTimeIncrementType = 'DAY'
        TTL_search = currently_now - datetime.timedelta(days=int(typeTTLTimeIncrement))
    if typePurge == 0:
        print('Archiving....')
        TTL_search = TTL_search.strftime("%Y-%m-%d %H:%M:%S")
        print(TTL_search)
        queryCount_Cleaning(TTL_search, typeTitle)
    else:
        print('Purging....')
        TTL_search = TTL_search.strftime("%Y-%m-%d %H:%M:%S")
        print(TTL_search)
        queryCount_Purge(TTL_search, typeTitle)
    return

# ------------------------------------------------------------------------------
#
# MAIN SCRIPT
#
# ------------------------------------------------------------------------------

update_target = []
complete_target = []
special = r'\u'

for request in request_type:
    typeID = request[0]
    typeTitle = request[1]
    typeTimeIncrement = request[2]
    typeTimeIncrementType = request[3]
    typeEmailActive = request[4]
    typeEmailAddresses = request[5]
    typeTTLTimeIncrement = request[6]
    typeTTLTimeIncrementType = request[7]
    typePurge = request[8]
    payload = gatherdata(typeID, typeTitle, typeTimeIncrement, typeTimeIncrementType)
    typeTitle = '{}'.format(typeTitle)
    typeTTLTimeIncrement = '{}'.format(typeTTLTimeIncrement)
    typeTTLTimeIncrementType = '{}'.format(typeTTLTimeIncrementType)
    for item in payload['response']['requests']:
        date_created = datetime.datetime.utcfromtimestamp(int(item['request']['date_created'])).strftime('%Y-%m-%d %H:%M:%S')
        date_created = '{}'.format(date_created)
        id = item['request']['id']
        id = '{}'.format(id)
        status = item['request']['status']
        status = '{}'.format(status)
        title = '' if '{}'.format(item['request']['title']) == '' else '{}'.format(item['request']['title'])
        description = item['request']['description']
        description = '' if description == None else description.encode('ascii', 'ignore')
        try:
            description = description.replace('\'', '')
        except:
            description = description.decode('utf8')
            description = description.replace('\'', '')
        description = '' if '{}'.format(description) == '' else '{}'.format(description)
        address = item['request']['address']
        address = '' if address == None else address.encode('ascii', 'ignore')
        try:
            address = address.replace('\'', '')
        except:
            address = address.decode('utf8')
            address = address.replace('\'', '')
        address = '' if '{}'.format(address) == '' else '{}'.format(address)
        zipcode = 0 if '{}'.format(item['request']['zipcode']) == '' else '{}'.format(item['request']['zipcode'])
        latitude = 0 if '{}'.format(item['request']['lat']) == 'None' else '{}'.format(item['request']['lat'])
        longitude = 0 if '{}'.format(item['request']['lon']) == 'None' else '{}'.format(item['request']['lon'])
        # Other details that may or may not be needed
        rank = 0 if '{}'.format(item['request']['rank']) == '' else '{}'.format(item['request']['rank'])
        user = '' if '{}'.format(item['request']['user']) == '' else '{}'.format(item['request']['user'])
        location = item['request']['location']
        location = '' if location == None else location.encode('ascii', 'ignore')
        try:
            location = location.replace('\'', '')
        except:
            location = location.decode('utf8')
            location = location.replace('\'', '')
        location = '' if '{}'.format(location) == '' else '{}'.format(location)
        user_comments = 0 if '{}'.format(item['request']['user_comments']) == '' else '{}'.format(item['request']['user_comments'])
        user_follows = 0 if '{}'.format(item['request']['user_follows']) == '' else '{}'.format(item['request']['user_follows'])
        user_request = 0 if '{}'.format(item['request']['user_request']) == '' else '{}'.format(item['request']['user_request'])
        comment_count = 0 if '{}'.format(item['request']['count_comments']) == '' else '{}'.format(item['request']['count_comments'])
        followers = 0 if '{}'.format(item['request']['count_followers']) == '' else '{}'.format(item['request']['count_followers'])
        supporters = 0 if '{}'.format(item['request']['count_supporters']) == '' else '{}'.format(item['request']['count_supporters'])
        image_thumbnail = 0 if '{}'.format(item['request']['image_thumbnail']) == '' else '\'{}\''.format(item['request']['image_thumbnail'])
        pushToAGOL(date_created, id, status, title, description, address, zipcode, latitude, longitude, rank, user, location,
                   user_comments, user_follows, user_request, comment_count, followers, supporters, image_thumbnail,
                   typeTitle, typeEmailAddresses, typeEmailActive)
        #try:
        #    pushToAGOL(date_created, id, status, title, description, address, zipcode, latitude, longitude, rank, user, location,
        #               user_comments, user_follows, user_request, comment_count, followers, supporters, image_thumbnail, typeTitle, typeEmailAddresses, typeEmailActive)
        #except Exception as pushToAGOL_error:
        #    error_status = '{}'.format(pushToAGOL_error.args[0])
        #    send_error(error_status)
    clean_up(typeTitle, typeTTLTimeIncrement, typeTTLTimeIncrementType, typePurge)
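To adapt the demo to the Survey123 case, the publicstuff pull in `retrieveData` would instead query the survey's own feature layer REST endpoint for new responses. A hedged sketch of just that URL construction (the example service URL is a placeholder, and `CreationDate` is the usual editor-tracking field name, so verify it against your own layer's schema):

```python
import urllib.parse

def survey_query_url(layer_url, token, since_unix_ms):
    """Build a feature-layer query URL for responses created after a cutoff.

    The field name CreationDate and the layer URL are assumptions for
    illustration; ArcGIS editor-tracking timestamps are epoch milliseconds.
    """
    params = {
        'f': 'json',
        'where': 'CreationDate > {}'.format(since_unix_ms),
        'outFields': '*',
        'token': token,
    }
    return layer_url.rstrip('/') + '/query?' + urllib.parse.urlencode(params)

q_url = survey_query_url('https://services.example.com/FeatureServer/0/', 'TOKEN', 1700000000000)
```

The rest of the pattern (token, count check, update vs. insert) carries over as-is.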
You could use a notebook to read, transform, and write to another layer.
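For instance, the "transform" step of such a notebook could match a survey response to its yard feature and build an updateFeatures-style payload. A minimal sketch, where field names like `ColonyCount`, `removed_qty`, and `added_qty` are placeholders for the actual schema:

```python
def build_update(yard_feature, survey_attrs):
    """Turn one survey response into an updateFeatures-style payload
    for the matching yard feature. Field names are placeholders."""
    removed = int(survey_attrs.get('removed_qty') or 0)
    added = int(survey_attrs.get('added_qty') or 0)
    attrs = yard_feature['attributes']
    return [{
        'attributes': {
            'OBJECTID': attrs['OBJECTID'],
            'ColonyCount': attrs['ColonyCount'] + added - removed,
        }
    }]

yard = {'attributes': {'OBJECTID': 12, 'ColonyCount': 30}}
payload = build_update(yard, {'removed_qty': 4, 'added_qty': 0})  # ColonyCount -> 26
```

The payload would then be posted to the yard layer's updateFeatures endpoint, as the script below does.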
Hi John, thank you for the response. Do you have a sample you could share of what that script would look like? Anything I can follow along with would help; I am fairly new to ArcGIS Notebooks and Python in general, so anything that gives me direction would be amazing.
I don't have anything public that I can share right this second, but this is the closest I can get you. In this use case, there is an API being queried and writing to an REST Service. You could just swap out the source API for your data source and pretty much everything else would fall into alignment. FYI, I am not using the easiest of methods which would be to use all of the libraries AGOL offers in Notebooks. This script was a quick demo showing we could lift from our batch server and insert into Portal/AGOL Notebooks and have things run without too much modification (in some cases).
# Configure for DB or Feature Service
useDB = 0 #Binary off / on# Data Source
url = 'https://www.publicstuff.com/api/2.0/requests_list?return_type=json'# API Key
api_key = ''# Max Pull per Record Type
max_pull = 1000 # Set to 'Max' if you don't want to limit it.# Retrieve Request Type (format = ID Type being gathered, Title, time increment, time increment type (days, hours, minutes, seconds), E-mail Active binary, E-mails to,
# TTL Time Increment, TTL Time type (day, hour, minute, second), purge archived binary)
request_type = [['1017049', 'Illegal Gatherings', 1, 'days', 0, 'test@bellevuewa.gov', 2, 'days', 0], ['1015636', 'Illegal Fireworks', 1, 'days', 0, 'test@bellevuewa.gov', 1, 'days', 0]]# Configure the e-mail server and other info here.
mail_server = 'test.bellevuewa.gov'
mail_from = 'MyBelleuve App<noreply@bellevuewa.gov>'
onfail_mail_to = 'test@bellevuewa.gov'# Targeted Service & layer for Data
service_URL = 'https://services1.arcgis.com/EYzEZbDhXZjURPbP/arcgis/rest/services/My_Bellevue_Apps/FeatureServer/0/'# ArcGIS Online Credentials
AGOL_User = 'gisdba_cobgis'
AGOL_Pass = 'Put the Password Here'# ------------------------------------------------------------------------------
# DO NOT UPDATE BELOW THIS LINE OR RISK DOOM AND DISPAIR! Have a nice day!
# ------------------------------------------------------------------------------# Import Python libraries
import urllib
import time
import datetime
import re
import requests, json, collections, string
import smtplib#-------------------------------------------------------------------------------
#
#
# Functions
#
#
#-------------------------------------------------------------------------------def gatherdata(typeID, typeTitle, typeTimeIncrement, typeTimeIncrementType):
pull_window = determineTime(typeTimeIncrement, typeTimeIncrementType)
payload = retrieveData(typeID, pull_window)return (payload)
def determineTime(typeTimeIncrement, typeTimeIncrementType):currentDateTime = int(time.time()) #Unix Time
if typeTimeIncrementType.upper() == 'SECONDS':
increment = typeTimeIncrement
elif typeTimeIncrementType.upper() == 'MINUTES':
increment = typeTimeIncrement * 60
elif typeTimeIncrementType.upper() == 'HOURS':
increment = typeTimeIncrement * 3600
elif typeTimeIncrementType.upper() == 'DAYS':
increment = typeTimeIncrement * 86400print ('Time Window: {} seconds'.format(increment))
pull_window = str(currentDateTime - increment)
print ('Unix Time Calc: {}'.format(pull_window))
return (pull_window)
def retrieveData(typeID, pull_window):
pull_url = url + '&request_type_id={}'.format(typeID)
if max_pull > 0:
pull_url = pull_url + '&limit={}'.format(max_pull)if int(pull_window) > 0:
pull_url = pull_url + '&after_timestamp={}'.format(pull_window)
pull_url = pull_url + '&api_key={}'.format(api_key)status_response = requests.get (pull_url)
status_data = status_response.json()return (status_data)
def send_new(date_created, id, status, title, description, address, zipcode ,latitude, longitude, rank, user, location,
user_comments, user_follows, user_request, comment_count, followers, supporters, image_thumbnail, typeTitle, typeEmailAddresses):mail_priority = '3'
mail_subject = '{} Reported'.format(typeTitle.replace('\'', ''))
mail_msg = ('An {} has been reported. Details are as follows:\n\n' +
'Report ID: {}\n'+
' Title: {}\n'+
' Description: {}\n'+
' Reported On: {}\n'+
' Address: {}, {}\n'+
' Location: {}\n'+
' Maplink: https://www.google.com/maps/search/?api=1&query={},{}\n'+
'\n[SYSTEM GENERATED MESSAGE]').format(typeTitle.replace('\'', ''), id.replace('\'', ''), title.replace('\'', ''), description.replace('\'', '')
, date_created.replace('\'', ''), address.replace('\'', ''), zipcode.replace('\'', ''), location.replace('\'', '')
, latitude.replace('\'', ''), longitude.replace('\'', ''))email_target = '{}'.format(typeEmailAddresses)
send_mail = ('To: {}\n'+
'From: {}\n'+
'X-Priority: {}\n'+
'Subject: {}\n\n'+
'{}').format(email_target, mail_from, mail_priority, mail_subject, mail_msg)print (' Sending message to {} recipients.\n'.format(email_target))
email_target_final = email_target
server = smtplib.SMTP(mail_server)
server.sendmail(mail_from, email_target_final, send_mail)
server.quit()return
def send_error(error_status):
mail_priority = '1'
mail_subject = 'Script Error: My Bellevue Data Pull'
mail_msg = ('An error has been reported with a recent data pull. Info as follows:\n\n' +
'Error: {}\n\n'+
'Please check this script ASAP!\n'+
'\n[SYSTEM GENERATED MESSAGE]').format(error_status)email_target = '{}'.format(onfail_mail_to)
send_mail = ('To: {}\n'+
'From: {}\n'+
'X-Priority: {}\n'+
'Subject: {}\n\n'+
'{}').format(email_target, mail_from, mail_priority, mail_subject, mail_msg)print (' Sending message to {} recipients.\n'.format(email_target))
email_target_final = email_target
server = smtplib.SMTP(mail_server)
server.sendmail(mail_from, email_target_final, send_mail)
server.quit()return
def get_token():
url = 'https://www.arcgis.com/sharing/rest/generateToken'
values = {'f': 'json',
'username': AGOL_User,
'password': AGOL_Pass,
'referer' : 'https://www.arcgis.com',
'expiration' : '10'}data = urllib.parse.urlencode(values).encode("utf-8")
req = urllib.request.Request(url)response = None
while response is None:
try:
response = urllib.request.urlopen(req,data=data)
except:
pass
the_page = response.read()#Garbage Collection with some house building
payload_json = the_page.decode('utf8')
payload_json = json.loads(payload_json)edit_token = payload_json['token']
return (edit_token)
def pushToAGOL(date_created, id, status, title, description, address, zipcode ,latitude, longitude, rank, user, location,
user_comments, user_follows, user_request, comment_count, followers, supporters, image_thumbnail, typeTitle, typeEmailAddresses, typeEmailActive):# Get token for editing/checking
edit_token = get_token()# Get count
item_count, objectID = queryCount(id,edit_token)if item_count > 0:
upload_payload = [{
'geometry' : {'x' : longitude, 'y' : latitude},
'attributes' : {
'OBJECTID': objectID,
'DateCreate' : date_created,
'RequestType' : typeTitle,
'Id' : id,
'Status' : status,
'Title' : title,
'Description' : description,
'Address' : address,
'ZipCode' : zipcode,
'Latitude' : latitude,
'Longitude' : longitude,
'Rank' : rank,
'Submitter' : user,
'Location' : location,
'UserComments' : user_comments,
'UserFollows' : user_follows,
'UserRequest' : user_request,
'CommentCount' : comment_count,
'Followers' : followers,
'Supporters' : supporters
}
}]
update_AGOL(upload_payload, edit_token)
else:
upload_payload = [{
'geometry' : {'x' : longitude, 'y' : latitude},
'attributes' : {
'DateCreate' : date_created,
'RequestType' : typeTitle,
'Id' : id,
'Status' : status,
'Title' : title,
'Description' : description,
'Address' : address,
'ZipCode' : zipcode,
'Latitude' : latitude,
'Longitude' : longitude,
'Rank' : rank,
'Submitter' : user,
'Location' : location,
'UserComments' : user_comments,
'UserFollows' : user_follows,
'UserRequest' : user_request,
'CommentCount' : comment_count,
'Followers' : followers,
'Supporters' : supporters
}
}]
insert_AGOL(upload_payload, edit_token)return
def insert_AGOL(upload_payload, edit_token):
edit_token = get_token()
FS_service = service_URL + 'addFeatures/?token={}'.format(edit_token)data = urllib.parse.urlencode({'f': 'json', 'features': upload_payload}).encode('utf-8')
response = None
while response is None:
try:
req = urllib.request.Request(FS_service)
response = urllib.request.urlopen(req,data=data)
except:
pass
#req = urllib.request.Request(FS_service)
#response = urllib.request.urlopen(req,data=data)
the_page = response.read()
the_json = response.info().get_content_charset('utf-8')
parsethis = json.loads(the_page.decode(the_json))for item in parsethis['addResults']:
if item['success'] == False:
error_status = '{}\n\nPayload update sent: {}'.format(parsethis, upload_payload)
send_error(error_status)
print (' FAILURE!!! Point Record Add Failed.')
else:
print (' Point Record Added.')return
def update_AGOL(upload_payload, edit_token):
edit_token = get_token()
FS_service = service_URL + 'updateFeatures/?token={}'.format(edit_token)
data = urllib.parse.urlencode({'f': 'json', 'features': upload_payload}).encode('utf-8')
#print (upload_payload)
response = None
while response is None:
try:
req = urllib.request.Request(FS_service)
response = urllib.request.urlopen(req,data=data)
except:
pass
#req = urllib.request.Request(FS_service)
#response = urllib.request.urlopen(req,data=data)
the_page = response.read()
the_json = response.info().get_content_charset('utf-8')
parsethis = json.loads(the_page.decode(the_json))#print (parsethis)
for item in parsethis['updateResults']:
if item['success'] == False:
error_status = '{}\n\nPayload update sent: {}'.format(parsethis, upload_payload)
send_error(error_status)
print (' FAILURE!!! Point Record Update Failed.')
else:
print (' Point Record Updated.')return
def queryCount(id,edit_token):
edit_token = get_token()
FS_service = service_URL + 'query/?token={}'.format(edit_token)
where_statement = 'Id=\'{}\''.format(id)
data = urllib.parse.urlencode({'f': 'json', 'where': where_statement, 'returnCountOnly': 'true'}).encode('utf-8')
req = urllib.request.Request(FS_service)
response = urllib.request.urlopen(req,data=data)
response_payload = response.read()
response_payload = json.loads(response_payload)item_count = response_payload['count']
if item_count > 0:
response = None
while response is None:
try:
where_statement = 'Id=\'{}\''.format(id)data = urllib.parse.urlencode({'f': 'json', 'where': where_statement, 'outFields':'OBJECTID'}).encode('utf-8')
req = urllib.request.Request(FS_service)
response = urllib.request.urlopen(req,data=data)
response_payload = response.read()
response_payload = json.loads(response_payload)
for oid_item in response_payload['features']:
objectID = oid_item['attributes']['OBJECTID']
except:
pass
else:
objectID = '0'print ('MyBellevue App ID: {}'.format(id))
return (item_count, objectID)
def queryCount_Cleaning(TTL_search, typeTitle):
edit_token = get_token()
FS_service = service_URL + 'query/?token={}'.format(edit_token)
where_statement = 'EditDate<\'{}\' and RequestType=\'{}\' and Status=\'completed\''.format(TTL_search, typeTitle)
data = urllib.parse.urlencode({'f': 'json', 'where': where_statement, 'returnCountOnly': 'true'}).encode('utf-8')
req = urllib.request.Request(FS_service)
response = urllib.request.urlopen(req,data=data)
response_payload = response.read()
response_payload = json.loads(response_payload)item_count = response_payload['count']
if item_count > 0:
response = None
while response is None:
try:
where_statement = 'EditDate<\'{}\' and RequestType=\'{}\' and Status=\'completed\''.format(TTL_search, typeTitle)data = urllib.parse.urlencode({'f': 'json', 'where': where_statement, 'outFields':'OBJECTID'}).encode('utf-8')
req = urllib.request.Request(FS_service)
response = urllib.request.urlopen(req,data=data)
response_payload = response.read()
response_payload = json.loads(response_payload)
for oid_item in response_payload['features']:
objectID = oid_item['attributes']['OBJECTID']
upload_payload = [{
'attributes' : {
'OBJECTID': objectID,
'Status' : 'archived'
}
}]
update_AGOL(upload_payload, edit_token)
except:
print ('Failed to clean entirely')else:
print ('No cleaning required.')return
def queryCount_Purge(TTL_search, typeTitle):
    edit_token = get_token()
    FS_service = service_URL + 'query/?token={}'.format(edit_token)
    where_statement = 'EditDate < \'{}\' and RequestType = \'{}\' and Status in (\'completed\', \'archived\')'.format(TTL_search, typeTitle)
    data = urllib.parse.urlencode({'f': 'json', 'where': where_statement, 'returnCountOnly': 'true'}).encode('utf-8')
    req = urllib.request.Request(FS_service)
    response = urllib.request.urlopen(req, data=data)
    response_payload = response.read()
    response_payload = json.loads(response_payload)
    item_count = response_payload['count']
    if item_count > 0:
        response = None
        while response is None:
            try:
                where_statement = 'EditDate < \'{}\' and RequestType = \'{}\' and Status in (\'completed\', \'archived\')'.format(TTL_search, typeTitle)
                data = urllib.parse.urlencode({'f': 'json', 'where': where_statement}).encode('utf-8')
                FS_service = service_URL + 'deleteFeatures/?token={}'.format(edit_token)
                req = urllib.request.Request(FS_service)
                response = urllib.request.urlopen(req, data=data)
                response_payload = response.read()
                response_payload = json.loads(response_payload)
            except:
                print ('Failed to purge entirely')
    else:
        print ('No purging required.')
    return
def clean_up(typeTitle, typeTTLTimeIncrement, typeTTLTimeIncrementType, typePurge):
    currently_now = datetime.datetime.now()
    if typeTTLTimeIncrementType.upper() == 'SECONDS':
        typeTTLTimeIncrementType = 'SECOND'
        TTL_search = currently_now - datetime.timedelta(seconds = int(typeTTLTimeIncrement))
    elif typeTTLTimeIncrementType.upper() == 'MINUTES':
        typeTTLTimeIncrementType = 'MINUTE'
        TTL_search = currently_now - datetime.timedelta(minutes = int(typeTTLTimeIncrement))
    elif typeTTLTimeIncrementType.upper() == 'HOURS':
        typeTTLTimeIncrementType = 'HOUR'
        TTL_search = currently_now - datetime.timedelta(hours = int(typeTTLTimeIncrement))
    elif typeTTLTimeIncrementType.upper() == 'DAYS':
        typeTTLTimeIncrementType = 'DAY'
        TTL_search = currently_now - datetime.timedelta(days = int(typeTTLTimeIncrement))
    if typePurge == 0:
        print ('Archiving....')
        TTL_search = TTL_search.strftime("%Y-%m-%d %H:%M:%S")
        print (TTL_search)
        queryCount_Cleaning(TTL_search, typeTitle)
    else:
        print ('Purging....')
        TTL_search = TTL_search.strftime("%Y-%m-%d %H:%M:%S")
        print (TTL_search)
        queryCount_Purge(TTL_search, typeTitle)
    return
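As a quick sanity check on the TTL arithmetic in `clean_up`: it subtracts a `timedelta` from the current time and formats the result as the `EditDate` cutoff string. With a 2-day TTL, for example:

```python
import datetime

# Fixed timestamp so the example is repeatable; clean_up uses datetime.datetime.now().
currently_now = datetime.datetime(2023, 6, 15, 12, 0, 0)
TTL_search = currently_now - datetime.timedelta(days=2)
print(TTL_search.strftime("%Y-%m-%d %H:%M:%S"))  # 2023-06-13 12:00:00
```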
#-------------------------------------------------------------------------------
#
#
# MAIN SCRIPT
#
#
#-------------------------------------------------------------------------------
update_target = []
complete_target = []
special = r'\u'
for request in request_type:
    typeID = request[0]
    typeTitle = request[1]
    typeTimeIncrement = request[2]
    typeTimeIncrementType = request[3]
    typeEmailActive = request[4]
    typeEmailAddresses = request[5]
    typeTTLTimeIncrement = request[6]
    typeTTLTimeIncrementType = request[7]
    typePurge = request[8]
    payload = gatherdata(typeID, typeTitle, typeTimeIncrement, typeTimeIncrementType)
    typeTitle = '{}'.format(typeTitle)
    typeTTLTimeIncrement = '{}'.format(typeTTLTimeIncrement)
    typeTTLTimeIncrementType = '{}'.format(typeTTLTimeIncrementType)
    for item in payload['response']['requests']:
        date_created = (datetime.datetime.utcfromtimestamp(int(item['request']['date_created'])).strftime('%Y-%m-%d %H:%M:%S'))
        date_created = '{}'.format(date_created)
        id = item['request']['id']
        id = '{}'.format(id)
        status = item['request']['status']
        status = '{}'.format(status)
        title = '' if '{}'.format(item['request']['title']) == '' else '{}'.format(item['request']['title'])
        description = item['request']['description']
        description = '' if description == None else description.encode('ascii','ignore')
        try:
            description = description.replace('\'','')
        except:
            description = description.decode('utf8')
            description = description.replace('\'','')
        description = '' if '{}'.format(description) == '' else '{}'.format(description)
        address = item['request']['address']
        address = '' if address == None else address.encode('ascii','ignore')
        try:
            address = address.replace('\'','')
        except:
            address = address.decode('utf8')
            address = address.replace('\'','')
        address = '' if '{}'.format(address) == '' else '{}'.format(address)
        zipcode = 0 if '{}'.format(item['request']['zipcode']) == '' else '{}'.format(item['request']['zipcode'])
        latitude = 0 if '{}'.format(item['request']['lat']) == 'None' else '{}'.format(item['request']['lat'])
        longitude = 0 if '{}'.format(item['request']['lon']) == 'None' else '{}'.format(item['request']['lon'])
        # Other details that may or may not be needed
        rank = 0 if '{}'.format(item['request']['rank']) == '' else '{}'.format(item['request']['rank'])
        user = '' if '{}'.format(item['request']['user']) == '' else '{}'.format(item['request']['user'])
        location = item['request']['location']
        location = '' if location == None else location.encode('ascii','ignore')
        try:
            location = location.replace('\'','')
        except:
            location = location.decode('utf8')
            location = location.replace('\'','')
        location = '' if '{}'.format(location) == '' else '{}'.format(location)
        user_comments = 0 if '{}'.format(item['request']['user_comments']) == '' else '{}'.format(item['request']['user_comments'])
        user_follows = 0 if '{}'.format(item['request']['user_follows']) == '' else '{}'.format(item['request']['user_follows'])
        user_request = 0 if '{}'.format(item['request']['user_request']) == '' else '{}'.format(item['request']['user_request'])
        comment_count = 0 if '{}'.format(item['request']['count_comments']) == '' else '{}'.format(item['request']['count_comments'])
        followers = 0 if '{}'.format(item['request']['count_followers']) == '' else '{}'.format(item['request']['count_followers'])
        supporters = 0 if '{}'.format(item['request']['count_supporters']) == '' else '{}'.format(item['request']['count_supporters'])
        image_thumbnail = 0 if '{}'.format(item['request']['image_thumbnail']) == '' else '\'{}\''.format(item['request']['image_thumbnail'])
        pushToAGOL(date_created, id, status, title, description, address, zipcode, latitude, longitude, rank, user, location,
                   user_comments, user_follows, user_request, comment_count, followers, supporters, image_thumbnail, typeTitle, typeEmailAddresses, typeEmailActive)
        #try:
        #    pushToAGOL(date_created, id, status, title, description, address, zipcode, latitude, longitude, rank, user, location,
        #               user_comments, user_follows, user_request, comment_count, followers, supporters, image_thumbnail, typeTitle, typeEmailAddresses, typeEmailActive)
        #except Exception as pushToAGOL_error:
        #    error_status = '{}'.format(pushToAGOL_error.args[0])
        #    send_error(error_status)
    clean_up(typeTitle, typeTTLTimeIncrement, typeTTLTimeIncrementType, typePurge)
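For the bee-yard workflow specifically, the step this script performs in `pushToAGOL`/`update_AGOL` would become: read the added/removed quantities from each new Survey123 record, look up the matching yard in the point layer, adjust the colony count, and push the edit back through the layer's `updateFeatures` endpoint. A minimal sketch of the arithmetic and the update payload, assuming a hypothetical `ColonyCount` field on the yards layer (substitute your real schema):

```python
import json
import urllib.parse

def adjust_colony_count(current_count, added, removed):
    # New inventory after a survey visit; clamp at zero so a bad entry
    # can't drive the count negative.
    return max(current_count + added - removed, 0)

def build_colony_update(object_id, new_count):
    # updateFeatures-style request body for the yards layer, mirroring the
    # urlencode pattern used throughout the script above.
    features = [{'attributes': {'OBJECTID': object_id, 'ColonyCount': new_count}}]
    return urllib.parse.urlencode({'f': 'json', 'features': json.dumps(features)}).encode('utf-8')

# A yard with 24 colonies where the survey reported 2 added and 5 removed:
new_count = adjust_colony_count(24, added=2, removed=5)
print(new_count)  # 21
```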
Thank you so much for sharing, John. I am working through it to adapt it to our workflow, but from what I can tell, this is the solution to our problem!
Awesome! Glad to hear that will work for your needs.