Add attachment larger than 10 MBs

02-22-2018 01:31 PM
JakeSkinner
Esri Esteemed Contributor

Has anyone had success uploading an attachment larger than 10 MB to an ArcGIS Online hosted feature service?  I know that this is not possible via a web browser.  I'm trying to upload a video (.mp4) that is around 50 MB.  I broke the video into smaller chunks so that each .mp4 is less than 10 MB.  I tried the following Python script, using the uploadPart and addAttachment REST operations, but keep receiving an error from the uploadPart request in my addPart function:

{u'error': {u'message': u'', u'code': 400, u'details': [u'Unable to upload item.']}}

Below is my script.  Any help is appreciated.

import requests, json, os

# Disable warnings
requests.packages.urllib3.disable_warnings()

# Variables
username = "jskinner_Tred"
password = "****"
id = 1252
TempDir = r"D:\Projects\Treddyfrin\Temp"

# Generate AGOL Token
tokenURL = 'https://www.arcgis.com/sharing/rest/generateToken'
params = {'f': 'pjson', 'username': username, 'password': password, 'referer': 'http://www.arcgis.com', 'expiration': 21600}
r = requests.post(tokenURL, data = params, verify=False)
response = json.loads(r.content)
token = response['token']

# Register Item
print("Registering Item")
uploadURL = 'https://services2.arcgis.com/o9a0D2ZwwG8CDFVQ/arcgis/rest/services/Sewers/FeatureServer/uploads/regi...'
params = {'f': 'pjson', 'token': token, 'itemName': 'attachment1252', 'description': 'Attachment for feature 1252'}
r = requests.post(uploadURL, data = params, verify=False)
response = json.loads(r.content)
itemID = str(response['item']['itemID'])

# Upload Parts of video to Registered Item
def addPart(itemID, token, filename, num):
    '''Add part of the input video using a post request through the request module'''

    print("Adding parts of video")

    addpartURL = 'https://services2.arcgis.com/o9a0D2ZwwG8CDFVQ/arcgis/rest/services/Sewers/FeatureServer/uploads/' + itemID + '/uploadPart'
    params = {'token':token, 'f':'pjson', 'partNumber': str(num)}
    image = {'partFile': (os.path.basename(filename), open(filename, 'rb'), 'video/mp4')}
    response = requests.post(addpartURL, data = params, files = image)
    print(json.loads(response.text))

# Walk through temp directory to get each video part
for a, b, c in os.walk(TempDir):
    for name in c:
        print(name)
        index = int(name.replace('output', '').replace('.mp4', ''))
        addPart(itemID, token, os.path.join(TempDir,name), index)

# Commit upload
print("Committing upload")
commitURL = 'https://services2.arcgis.com/o9a0D2ZwwG8CDFVQ/arcgis/rest/services/Sewers/FeatureServer/uploads/' + itemID + '/commit'
params = {'f': 'pjson', 'token': token}
response = requests.post(commitURL, data = params)
print(json.loads(response.text))

# Add attachment to feature
print("Adding attachment to feature")
fsURL = 'https://services2.arcgis.com/o9a0D2ZwwG8CDFVQ/ArcGIS/rest/services/Sewers/FeatureServer/0/' + str(id) + '/addAttachment'
params = {'f': 'pjson', 'token': token, 'uploadId': itemID}
response = requests.post(fsURL, data = params)
print(response.content)
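The splitting step mentioned above isn't shown in the script; here is one way it could look. This is a minimal sketch, not the poster's actual method: the output{N}.mp4 naming matches what the os.walk loop in the script expects, and the 9 MB default chunk size is just an assumed safety margin under the 10 MB cap.

```python
import os

def split_file(path, out_dir, chunk_size=9 * 1024 * 1024):
    """Split a file into numbered pieces (output1.mp4, output2.mp4, ...),
    each no larger than chunk_size bytes. Returns the list of part paths."""
    os.makedirs(out_dir, exist_ok=True)
    parts = []
    with open(path, 'rb') as src:
        num = 1
        while True:
            data = src.read(chunk_size)
            if not data:
                break
            part_path = os.path.join(out_dir, 'output{}.mp4'.format(num))
            with open(part_path, 'wb') as dst:
                dst.write(data)
            parts.append(part_path)
            num += 1
    return parts
```

Concatenating the parts back together reproduces the original file byte-for-byte, so the only requirement is that they are uploaded with sequential part numbers.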
4 Replies
joerodmey
MVP Alum

Jake Skinner‌ did you find a way to upload larger than 10MB?

JakeSkinner
Esri Esteemed Contributor

joe rodmey, I haven't tested this myself, but try the following:

Layer Attachments | ArcGIS for Developers 

This leverages the new ArcGIS API for Python.

TorrinHultgren2
New Contributor III

We found that although the stated limit is 10MB, we received that same generic error when we attempted to upload chunks as small as 4MB, and we were only able to reliably and consistently succeed with multipart uploads when the chunks were around 3MB. Our guess is that multipart form encoding is simply far less compact, so when a file or part of a file is converted to form-encoded UTF-8 it's much bigger and hits the 10MB cutoff sooner. That means that although you can get right up to 10MB for a standard upload, once you cross that 10MB threshold you have to upload in much smaller chunks because of the multipart encoding. That said, it's not the end of the world once you've got the plumbing for multipart uploads coded; just set the chunk limit smaller. Our simplified code block is below, hope it helps.

# Upload large file to Feature Service function
def uploadMultipartFeatures(token, feat_service, file):
    """Add the item to the portal in chunks."""
    import requests, os
    def read_in_chunks(file_object, chunk_size=3000000):
        """Generate file chunks of 3MB"""
        while True:
            data = file_object.read(chunk_size)
            if not data:
                break
            yield data
    try:
        title = os.path.basename(file)
        # First need to register multipart upload
        url = feat_service + "/uploads/register"
        params = {"description": "Multipart Upload",
                     "f": "json",
                     "token": token,
                     'itemName': title}
        response = requests.post(url, data=params)
        json_response = response.json()
        print("response was",json_response)
        # Bail out only if the service explicitly reports failure
        # (avoids a NameError when 'success' is absent from the response)
        success = json_response.get('success', True)
        if not success:
            return json_response
        itemID = json_response["item"]["itemID"]

        # With item registered, now begin multipart upload
        url = feat_service + "/uploads/{}/uploadPart".format(itemID)
        with open(file, 'rb') as f:
            for part_num, piece in enumerate(read_in_chunks(f), start=1):
                params = {"f": "json", "token": token, "partId": str(part_num)}
                fileObj = {"file": piece}
                print("posting",url,params,file)
                response = requests.post(url, data=params, files=fileObj)
                json_response = response.json()
                print("response was",json_response)
                success = json_response.get('success', True)
                if not success:
                    break  # If one part fails completely, don't bother with the rest.

        if success: # Only try to commit if we had success
            # After all parts are uploaded, commit so the service assembles them
            url = feat_service + "/uploads/{}/commit".format(itemID)
            params = {"f": "json", "token": token}
            print("posting",url,params)
            response = requests.post(url, data=params)
            json_response = response.json()
            print("response was",json_response)
        return json_response
    except Exception as e:
        raise ValueError("Problem with uploadMultipartFeatures. Url: {} Params: {} Response: {} Exception: {}".format(url, params, response.text, e))
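For anyone adapting this, the chunking logic can be sanity-checked offline before touching a service. Here is the read_in_chunks generator reproduced standalone and driven with an in-memory io.BytesIO buffer (no service calls involved):

```python
import io

def read_in_chunks(file_object, chunk_size=3000000):
    """Yield successive chunk_size-byte pieces from a file-like object."""
    while True:
        data = file_object.read(chunk_size)
        if not data:
            break
        yield data

# 7 MB of dummy bytes should come out as 3 MB + 3 MB + 1 MB
chunks = list(read_in_chunks(io.BytesIO(b'\x00' * 7000000)))
print([len(c) for c in chunks])  # [3000000, 3000000, 1000000]
```

The same enumerate(..., start=1) pattern used in the function above then gives those chunks the sequential partId values the service expects.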
JakeSkinner
Esri Esteemed Contributor

This is very helpful, thank you for this, Torrin.  I also found that you can upload an attachment larger than 10 MB using the ArcGIS API for Python.  See the following link:

Layer Attachments | ArcGIS for Developers 
