
Slow Attachment Downloads from AGOL Feature Layer (photos collected with Field Maps)

11-15-2022 04:25 PM
Silas_Mathes
New Contributor

Hello,

We've been using Field Maps (formerly Collector) since 2019 to document our fieldwork with points and photo attachments. The field data are stored in ArcGIS Online hosted feature layers. Our workflow is to collect the data to AGOL and then use an arcpy script to download the photos locally. Starting around August, the download time for each photo attachment (typically either 200 KB or 900 KB, depending on the Field Maps photo resolution setting) slowed to around 30 seconds. Previously it was around 2-3 seconds per attachment. We can collect several hundred photos for large sites, so downloading our photos now takes hours.

Has anyone else experienced slow attachment downloads from AGOL feature layers? If so, are you just living with the delay, or did you find a workaround?

We've tested various internet connections at home and in the office and double-checked our scripts, which use the following code snippet:

from arcgis.gis import GIS

# connect to the organization (URL and credentials are placeholders)
gis = GIS("URL", "username", "password")

# find the hosted feature layer item and get its first layer
search_result = gis.content.search("AGOL LAYER", "Feature Layer")
feature_layer_collection = search_result[1]
feature_layer = feature_layer_collection.layers[0]

# download a single attachment (OID and attachment ID are placeholders)
feature_layer.attachments.download(oid=XXXXXX, attachment_id=XXXXX)

 

Thanks for any help!

 

Silas Mathes

Davey Resource Group, Nashville, TN   

8 Replies
ArcDevelopment
Emerging Contributor

Did you ever get an answer to this? I'm finding that a single 1.7 MB PNG takes around 12 seconds to download, which is really slowing down our software.

SilasMathes
Emerging Contributor

We haven't experienced the slowdown since we started with fresh, empty AGOL hosted feature layers to collect our data/photos around January of this year. Our hosted layers are building back up in size (the biggest currently has around 500 attachments) as we collect field data, so I'll update you if we experience slow downloads again. How many records/attachments are already in your hosted layers?

sakurai
New Contributor

The attachment download() method is very time-consuming (ArcGIS API for Python 2.3.0), so I implemented the attachment download using the REST API instead:

import os
import requests

# fLayer is the target FeatureLayer, features its queried features,
# oidField the ObjectID field name, and attachment_dir the local output folder
for feature in features:
    attachments = fLayer.attachments.get_list(oid=feature.attributes[oidField])
    for attachment in attachments:
        # build the REST URL: <layer url>/<objectid>/attachments/<attachment id>
        attachmentUrl = "{0}/{1}/attachments/{2}".format(fLayer.url, feature.attributes[oidField], attachment["id"])
        response = requests.get(attachmentUrl, params={"token": gis._con.token})
        attachment_path = os.path.join(attachment_dir, attachment["name"])
        with open(attachment_path, 'wb') as saveFile:
            saveFile.write(response.content)
CalvinHarmin
Frequent Contributor
from arcgis.features import FeatureLayer

lyr = FeatureLayer(url=layer_url, gis=gis)  # establish a FeatureLayer object from the AGOL source
attachment_manager = lyr.attachments
attachment_list = attachment_manager.get_list(objectid)  # a list of small dicts of attachment attributes
if attachment_list:  # skip features with no attachments
    for attachment in attachment_list:
        # download the attachment; returns a list containing the saved file path
        attachment_path = attachment_manager.download(objectid, attachment["id"])

I've tried to boil down the code I'm using to the above. I'm experiencing an issue where, for example, a Survey123 form submission (which saves to an AGOL hosted feature layer) has 4 attachments and it takes more than 15 minutes to download all of the photos. Each photo is about 9 MB, and each one takes more than 3 minutes to download. This seems extremely slow. I haven't tried @sakurai's workaround, but I just wanted to add my experience with the slow download behavior.

SilasMathes
Emerging Contributor

@CalvinHarmin I can confirm that @sakurai's method of using the standard Python requests library is much faster for attachment downloads--the only difficulty I had in implementation was finding the correct way to reference a valid token. I ended up using:

import arcpy

token = arcpy.GetSigninToken()  # dict with 'token', 'expires', 'referer' for the active portal sign-in
token = token["token"]

Let me know if you're interested and I can share our script. Esri reported some server issues yesterday (3/5/2025) for hosted feature layers, and we couldn't download anything. I think that's resolved now, though.

RhettZufelt
MVP Notable Contributor

Have you looked at the photo sizes for the latest points collected? 

I ask because I have a project in Field Maps that is set to capture Small-sized photos, and for some reason, randomly at some point, it started collecting them at the device's default size (as large as they get).

This made the images so large that they sometimes timed out when submitting and took forever to download after that.

I looked in Field Maps Designer and it was still set to Small. Setting it to Large and saving, then back to Small and saving, 'fixed' the issue, and it started saving small-sized photos again. At least, until the next time it changed itself.

So now I routinely check some of the newly collected photo sizes in the hosted feature layer, and if/when they start getting huge again, I repeat the Field Maps Designer sequence to 'fix' it.
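In case it's useful, here is a rough sketch of how you could spot-check those sizes with the ArcGIS API for Python; the attachment info dicts returned by get_list() include a 'size' value in bytes. feature_layer stands for the hosted layer object as in the snippets above, and the 1 MB threshold is just an example.

# list any attachments larger than roughly 1 MB on the layer (sketch)
oids = feature_layer.query(where="1=1", return_ids_only=True)["objectIds"]
for oid in oids:
    for att in feature_layer.attachments.get_list(oid=oid):
        size_kb = att["size"] / 1024
        if size_kb > 1024:  # flag anything over about 1 MB
            print(f"OID {oid}: {att['name']} is {size_kb:.0f} KB")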

R_

SilasMathes
Emerging Contributor

Thanks Rhett--we'll check our attachment sizes (and settings) frequently from now on!

ConradSchaefer__DOIT_
Regular Contributor

We too have been struggling with the slow download() performance on attachments. It has become more than a nuisance as our customer is now hitting timeout issues. The solution proposed by sakurai has drastically reduced the download time.

Our scenario

We have a Survey123 survey for field workers to use on site visits. They can take up to 50 photos per visit. The most our customer has uploaded so far is 30 photos, so the maximum hasn't been tested, and they are already hitting timeout errors at 30 photos. A note: our first attempt at solving the problem, before we understood that the download() step was the culprit, was service tuning.

The photos are requested by a custom geoprocessing service on ArcGIS Server that creates PDF reports containing text data pages, photo-sheet pages, and a map of the site with points marking the photo locations. The downloaded image files are all under 500 KB in size, and most are around 250-300 KB. There is very little data being downloaded.

We were using the feature layer > attachments > download() functionality in the ArcGIS API for Python. I believe this is the API resource https://developers.arcgis.com/python/latest/api-reference/arcgis.features.managers.html#arcgis.featu... 

Solution

The report process now uses the requests module to make GET requests directly for the attachment images, and the download times are much better. I did some timing on different environments (local computer and servers) with different tools (ArcGIS Pro, Jupyter Notebook), and the completion time for downloading the 30 photos decreased by about 70-80 percent.
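For anyone who wants to reproduce that comparison, here is a minimal timing sketch; sample_oid and sample_attachment_id are placeholders for one known attachment, and feature_layer_1 and gis match the snippet further down.

import time
import requests

# time the ArcGIS API for Python download() for one attachment
start = time.perf_counter()
feature_layer_1.attachments.download(oid=sample_oid, attachment_id=sample_attachment_id)
print(f"attachments.download(): {time.perf_counter() - start:.1f} s")

# time a direct REST GET for the same attachment
start = time.perf_counter()
url = f"{feature_layer_1.url}/{sample_oid}/attachments/{sample_attachment_id}"
requests.get(url, params={"token": gis._con.token})
print(f"requests.get(): {time.perf_counter() - start:.1f} s")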

Taking sakurai's code above and tweaking it slightly for our purpose, we roughly have this design:

import os
import requests

survey_records_item_id = "xyzxy"  # placeholder item ID
survey_records_item = gis.content.get(survey_records_item_id)

feature_layer_0 = survey_records_item.layers[0]  # survey records
feature_layer_1 = survey_records_item.layers[1]  # photos

# spatially enabled dataframe of the photo records
survey_photos_sdf = feature_layer_1.query().sdf

example_survey_globalid = "xyzxy"  # placeholder parent global ID
relevant_photos_sdf = survey_photos_sdf.query(f"parentglobalid == '{example_survey_globalid}'")

for oid in relevant_photos_sdf["objectid"].tolist():
    attachments = feature_layer_1.attachments.get_list(oid=oid)
    for attachment in attachments:
        attachmentUrl = "{0}/{1}/attachments/{2}".format(feature_layer_1.url, oid, attachment["id"])
        response = requests.get(attachmentUrl, params={"token": gis._con.token})
        attachment_path = os.path.join(r'./xyzx', attachment["name"])
        with open(attachment_path, 'wb') as save_file:
            save_file.write(response.content)

One final thought: I have not encountered any thrown exceptions yet, but I am monitoring for them so I can improve the code with some exception handling.
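For what it's worth, one possible shape for that exception handling is sketched below, reusing the names from the snippet above; the timeout value and the skip-on-error behavior are just illustrative.

try:
    response = requests.get(attachmentUrl, params={"token": gis._con.token}, timeout=60)
    response.raise_for_status()  # raise on HTTP 4xx/5xx instead of writing an error page to disk
except requests.exceptions.RequestException as err:
    print(f"Skipping attachment {attachment['name']}: {err}")
else:
    with open(attachment_path, 'wb') as save_file:
        save_file.write(response.content)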