
Looking for a code example of using Python REST API for applyEdits

a week ago
DonMorrison1
Frequent Contributor

Does anybody have a Python code example of submitting applyEdits to the ArcGIS REST API? I'm spending way too much time trying to get a very simple update to work. I tried the ArcGIS API for Python but ran into a problem with it, so I thought the REST API might work better. I think my problem is how I'm encoding the parameters, but I'm not sure.

I did a browser trace of the update done manually with the ArcGIS REST Services Directory tool, and this is the form data that gets sent - I'm just not sure how to reproduce it with Python.

adds=&updates=%5B%7B%22attributes%22%3A+%7B%22objectid%22%3A+zz%2C+%22is_copied%22%3A+1%7D%7D%5D&deletes=&gdbVersion=&rollbackOnFailure=true&useGlobalIds=false&returnEditMoment=false&trueCurveClient=true&attachments=&timeReferenceUnknownClient=false&datumTransformation=&editsUploadId=&async=false&returnEditResults=true&f=html
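Decoding that form body with the standard library makes the parameter structure easier to see. A quick sketch, using a trimmed copy of the trace above (the `zz` placeholder is left exactly as captured):

```python
from urllib.parse import parse_qs

# Trimmed copy of the browser trace; empty parameters omitted for brevity
trace = (
    "adds=&updates=%5B%7B%22attributes%22%3A+%7B%22objectid%22%3A+zz%2C"
    "+%22is_copied%22%3A+1%7D%7D%5D&deletes=&rollbackOnFailure=true&f=html"
)

# parse_qs URL-decodes each value; keys with empty values are dropped by default
params = parse_qs(trace)
print(params["updates"][0])
# → [{"attributes": {"objectid": zz, "is_copied": 1}}]
```

So the `updates` parameter is a JSON array of feature objects, sent as a single URL-encoded form field - which is what the Python code needs to reproduce.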

 


Accepted Solutions
DonMorrison1
Frequent Contributor

I finally figured it out. Here is some code that does work:

import json
import urllib.request
from urllib.parse import urlencode

url = '<my_url>/FeatureServer/0/applyEdits'
updates = [
    {
        "attributes": {"objectid": 2, "is_copied": 0}
    }
]
params = {
    'updates': json.dumps(updates),  # the service expects a JSON string, not a Python repr
    'f': 'json',
}
# POST bodies must be bytes, so encode the form data up front
req = urllib.request.Request(url, urlencode(params).encode('utf-8'))
response = urllib.request.urlopen(req)
data = json.load(response)
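Worth noting: applyEdits reports per-feature success inside the response body, so it pays to check `updateResults` rather than assume the edit stuck. A minimal sketch with a hand-built response dict standing in for the real `data` returned above:

```python
# Hand-built example of a typical applyEdits response body
data = {
    "addResults": [],
    "updateResults": [{"objectId": 2, "success": True}],
    "deleteResults": [],
}

# Collect any per-feature results that did not succeed
failed = [r for r in data.get("updateResults", []) if not r.get("success")]
if failed:
    print(f"{len(failed)} update(s) failed: {failed}")
else:
    print("all updates applied")
```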


6 Replies
AustinAverill
Frequent Contributor

This looks pretty good! The only thing you might need to do is generate a token in the event that the service requires authentication. That request would look like this; then you would just need to add 'token' to the params.

import json
import urllib.request
from urllib.parse import urlencode

url = "https://arcgis.com/sharing/rest/generateToken"
d = {
    'username': un,
    'password': pw,
    'client': 'referer',
    'referer': 'https://.arcgis.com',
    'f': 'json'
}
req = urllib.request.Request(url, urlencode(d).encode('utf-8'))
token_response = json.load(urllib.request.urlopen(req))
token = token_response['token']
HaydenWelch
MVP Regular Contributor

I was gonna say you usually want to encode that information in the headers, but apparently ArcGIS expects it to be sent as request params, weird... Seems like a great way to open up a parameter-tampering vector. They don't seem to use headers for much.

0 Kudos
AustinAverill
Frequent Contributor

Yeah, one would think. But here we are lol.

HaydenWelch
MVP Regular Contributor

I believe that the httpx library has been added to the latest version of arcpy's environment:

import httpx
import asyncio
import json


def get_requests(url: str, edits: list[httpx.QueryParams], client: httpx.AsyncClient):
    """Build requests using the edit params and the provided client"""
    yield from (client.build_request("POST", url, params=update) for update in edits)


async def apply_edits(url: str, edits: list[httpx.QueryParams]):
    async with httpx.AsyncClient() as client:
        # Add tasks to running async loop
        tasks = [client.send(request) for request in get_requests(url, edits, client)]
        print(f"Applying {len(tasks)} edits...")

        # Await the responses asynchronously (send them all at once and wait for them to respond)
        responses = await asyncio.gather(*tasks, return_exceptions=True)

        # Filter the responses and check for failed transactions
        valid_responses: list[httpx.Response] = []
        for response in responses:
            if not isinstance(response, httpx.Response):
                print(f"[ERROR]: {response}")
            elif response.status_code != 200:
                print(f"[ERROR]: <{response.status_code}> {response.reason_phrase}")
            else:
                valid_responses.append(response)

        print(f"Successfully applied {len(valid_responses)} edits!")

    # Return the json values for all responses
    return [r.json() for r in valid_responses]


async def main():
    URL = "<my_url>/FeatureServer/0/applyEdits"

    # Convert the dict to a QueryParams object (optional). QueryParams is just
    # a MultiDict, so x['a'] = 1 followed by x['a'] = 2 creates a dict with x['a'] = [1, 2].
    # Note: 'updates' has to go over the wire as a JSON string, hence json.dumps
    updates: list[dict] = [{"attributes": {"objectid": 2, "is_copied": 0}}]
    edits: list[dict] = [{'updates': json.dumps(updates), 'f': 'json'}]
    queries = list(map(httpx.QueryParams, edits))
    
    # Dispatch and await the responses
    responses = await apply_edits(URL, queries)
    
    # Do something with the responses
    print(responses)


if __name__ == "__main__":
    asyncio.run(main())

If you need to deal with tons of queries like this, it's definitely worth looking into, as it allows you to add the requests to an event loop ([send, send, send -> wait, wait, wait] instead of [send -> wait, send -> wait, send -> wait]) so you can dispatch as many as the server can handle at once.
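If the server starts choking under that kind of load, the usual fix is a semaphore to cap how many requests are in flight at once. A generic sketch of the pattern with a stand-in coroutine in place of the real HTTP call (no httpx needed to see the idea):

```python
import asyncio

async def send_edit(i: int, limit: asyncio.Semaphore) -> dict:
    # At most N coroutines hold the semaphore at once; the rest queue up
    async with limit:
        await asyncio.sleep(0)  # stand-in for the real HTTP round trip
        return {"edit": i, "success": True}

async def main() -> list[dict]:
    limit = asyncio.Semaphore(5)  # cap at 5 in-flight requests
    results = await asyncio.gather(*(send_edit(i, limit) for i in range(20)))
    print(f"{len(results)} edits applied")
    return results

results = asyncio.run(main())
```

Same fan-out/gather shape as above, just throttled so the server sees a bounded number of concurrent applyEdits calls.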

DonMorrison1
Frequent Contributor

My current use case is very low volume but your example looks like a better alternative to some other work I've done using the multiprocessing package to spin off parallel requests.