Downloading Photos

06-19-2023 10:27 PM
snowflake
New Contributor II

Hi,

I am using Survey123, and the data collection layer is getting pretty large; every entry has about six photos associated with it.
It's disappointing that there is no easy way to download the images published to AGOL with the survey.

I tried batch downloading from ArcGIS Pro, but it didn't work.
The only code that actually worked was, I believe, published on this forum: https://community.esri.com/t5/arcgis-survey123-blog/survey123-tricks-of-the-trade-download-survey-da...
It does the job well, but it takes a long time to download all the attachments. It's also great that it produces a CSV listing the attachments that belong to each entry.

This is where my lack of Python knowledge comes into play, and why I wish there were a simpler way of downloading attachments:
is there a way to add a filter to the code so it downloads only attachments from a specific date range, a specific user, or a specific locality?

That would make producing backups quicker and more frequent.

3 Replies
DougBrowning
MVP Esteemed Contributor

If you want to make backup copies, I use this script. It exports the entire service, with all layers, photos, etc., as a GDB every night. I just schedule it as a batch file in Windows. It tends to start failing at around 10 GB, and I have had a ticket open on that for some time now.

This one exports the entire service.

The config file is just the name you want to give the backup and the item ID of the service:

Lotic_2022_Service_Backup,713e3aaef9674e3493a64347d333b618


import os, time
from arcgis.gis import GIS

# vars for config file and output dir ---------
configFile = r"your path"
backupDir = r"your path"
#----------------------------------------------

# Make a connection through ArcGIS Pro
#gis = GIS('pro', verify_cert=False)
gis = GIS('pro')

# Read in the config file for the list of hosted feature services (HFS) for Survey123
with open(configFile, 'r') as BackupAllAGOHFSConfigFile:
    HFSList = BackupAllAGOHFSConfigFile.read().splitlines()


for HFS in HFSList:
    HFSname = HFS.split(",")[0]
    itemId = HFS.split(",")[1]          # this comes from the item's main AGO page with all the settings

    # Start the export-to-GDB job
    dt = time.strftime("%Y%m%d_%H%M%S")
    out_file = os.path.join(backupDir, "{}_{}".format(HFSname, dt))
    print("Export job started for {} at {}".format(HFSname, dt))
    fsLink = gis.content.get(itemId)
    result = fsLink.export("export" + HFSname + dt, "File Geodatabase")

    # Save to the file system
    print("Saving final downloaded FGDB to {}...".format(out_file))
    result.download(backupDir, out_file)

    # Remove the exported FGDB item from AGOL (cleanup)
    print("Removing the export file from AGOL")
    deleteResult = result.delete()
    print("Delete result is " + str(deleteResult))


This one takes a list of the specific layers you want, so you could export just the photos.

For this one, the config line just adds the layer list:

Lotic_2022_Service_Backup;713e3aaef9674e3493a64347d333b618;{"layers":[{"id":0},{"id":1},{"id":2},{"id":3},{"id":4},{"id":5},{"id":6},{"id":7},{"id":8},{"id":9},{"id":10},{"id":11},{"id":12},{"id":13},{"id":16},{"id":17},{"id":18},{"id":21},{"id":22},{"id":27},{"id":28},{"id":29},{"id":30},{"id":31},{"id":32},{"id":33},{"id":34},{"id":35}]}
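If the three-field format above is unclear, here is a minimal sketch of how such a line breaks apart: the fields are semicolon-delimited, and the third one is a JSON export-parameters object. (`parse_config_line` is a hypothetical helper for illustration, not part of the script below.)

```python
import json

def parse_config_line(line):
    """Split a semicolon-delimited config line into the backup name,
    the AGOL item ID, and the export-parameters dict (the third field
    is JSON, so split at most twice to keep it intact)."""
    name, item_id, params_json = line.split(";", 2)
    return name, item_id, json.loads(params_json)

# Shortened version of the config line from the post
line = ('Lotic_2022_Service_Backup;713e3aaef9674e3493a64347d333b618;'
        '{"layers":[{"id":0},{"id":1},{"id":2}]}')
name, item_id, params = parse_config_line(line)
# params["layers"] is now a list of layer-ID dicts ready to pass as
# export parameters, e.g. [{"id": 0}, {"id": 1}, {"id": 2}]
```

Splitting with `maxsplit=2` matters here: the JSON field itself contains no semicolons in this example, but capping the split keeps the third field whole even if it ever did.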


import os, time
from arcgis.gis import GIS

# vars for config file and output dir ---------
configFile = r"your path"
backupDir = r"your path"
#----------------------------------------------

# Make a connection through ArcGIS Pro
#gis = GIS('pro', verify_cert=False)
gis = GIS('pro')

# Read in the config file for the list of hosted feature services (HFS) for Survey123
with open(configFile, 'r') as BackupAllAGOHFSConfigFile:
    HFSList = BackupAllAGOHFSConfigFile.read().splitlines()


for HFS in HFSList:
    HFSname = HFS.split(";")[0]
    itemId = HFS.split(";")[1]               # this comes from the item's main AGO page with all the settings
    exportParameters = HFS.split(";")[2]     # added to export all layers but the photos, due to size
    #exportParameters = '{"layers":[{"id":0},{"id":2}]}'

    # Start the export-to-GDB job
    dt = time.strftime("%Y%m%d_%H%M%S")
    print("Export job started for {} at {}".format(HFSname, dt))
    fsLink = gis.content.get(itemId)
    result = fsLink.export("export" + HFSname + dt, "File Geodatabase", exportParameters)

    # Save to the file system
    out_file = os.path.join(backupDir, "{}_{}.zip".format(HFSname, dt))
    print("Saving final downloaded FGDB to {}...".format(out_file))
    result.download(backupDir, out_file)

    # Remove the exported FGDB item from AGOL (cleanup)
    print("Removing the export file from AGOL")
    deleteResult = result.delete()
    print("Delete result is " + str(deleteResult))


I have backed up all our production services every night with this script for the last five years now. It mostly works, but I really wish AGOL had built-in backups! It needs them to be truly enterprise-class.

Hope that helps

snowflake
New Contributor II

Thanks Doug.

The code from the blog works, similar to yours, but just yesterday I got a request: "Hey, can you export the photos that John submitted?"
Right now there are 600 photos; it took 2 hours to download them all, and of those, John had submitted 30.
That number (600) will double within a week.

Once I download the photos, I store them elsewhere, so there is no need to download everything every time. At the current speed of data collection, in a few months it will take days to download all the photos.
If I could download just specific photos every two weeks, or only the photos submitted in the past two weeks, that would be a perfect solution for me.

If there isn't a solution, we will likely have to stop collecting photos through Survey123/AGOL and move them from the phones to our server instead.

DougBrowning
MVP Esteemed Contributor

I gave you this code because you said you wanted the entire set of photos in an exported GDB.

If you want just some of the photos, then the script you linked to (or my version, also in that post) would work.

To get only certain photos, you can add a where clause to this line in my code:

with arcpy.da.SearchCursor("attachView", fieldList, "your where clause here") as cursor:
    for row in cursor:
        binaryRep = row[0]
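To address the original date-range/user question, here is a minimal sketch of assembling such a clause before passing it to the cursor. (`build_where_clause` is a hypothetical helper, and the `Creator`/`CreationDate` field names assume AGOL editor tracking is enabled on the layer; check your layer's schema for the actual names.)

```python
def build_where_clause(creator=None, start_date=None, end_date=None):
    """Assemble a SQL where clause that filters records by submitter
    and/or a creation-date range. Any argument left as None is simply
    omitted from the clause."""
    parts = []
    if creator:
        parts.append("Creator = '{}'".format(creator))
    if start_date:
        parts.append("CreationDate >= timestamp '{}'".format(start_date))
    if end_date:
        parts.append("CreationDate <= timestamp '{}'".format(end_date))
    return " AND ".join(parts)

# e.g. only John's submissions since June 2023:
where = build_where_clause(creator="John", start_date="2023-06-01 00:00:00")
# then pass it as the third SearchCursor argument:
# with arcpy.da.SearchCursor("attachView", fieldList, where) as cursor:
#     ...
```

The `timestamp '...'` literal syntax is what hosted feature layer queries generally expect for date fields; if your data lives in a different geodatabase type, the date syntax may differ.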

What we do is leave the photos in the service until the end of the season; then we download all of them and put them in their permanent spot on our web server. I think this works out better, since the service can get too big, and we take down the field-season service anyway. During the year we simply link to the photos where they are, or use tools like Ops Dashboard, where we have a full photo-review dashboard.

AGOL is designed to use the photos in place, which works in many applications. It sounds like your users want them the old-school way. That is understandable, but downloading them on a regular basis will involve some manual work. I suggest at least testing out some more modern ways to view and use the photos.

Hope that helps.

This is our photo-review dashboard:

[Attachment: Photo Dashboard.gif]
