
Any way to download submitted responses as .json payloads?

01-28-2025 03:00 PM
MattEdrich
Frequent Contributor

Hello everybody,

A fun trick I learned some months back is to use Power Automate, or a similar tool, to automatically archive submitted form responses as JSON payloads in a location external to Survey123 (e.g. SharePoint). As long as the associated feature service remains intact, any attachments connected to a particular payload can be retrieved by following the URLs in the JSON. It might not be something everybody needs, but this little "Submission Archive" has been particularly handy for me, since I regularly need to work with survey data coming from multiple forms.
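For context, pulling the attachments back down from one of those archived payloads looks something like the minimal Python sketch below. The payload structure ("attachments", "url", "name") is an illustrative assumption rather than the exact webhook schema, so match it to whatever your flow actually writes out:

import json
import requests

# Illustrative sketch: download the attachments referenced in one archived payload.
# The "attachments"/"url"/"name" keys are assumptions - match them to your own JSON.
token = "<your AGOL token>"  # secured services need a token on attachment URLs

with open("submission_payload.json") as f:
    payload = json.load(f)

for att in payload.get("attachments", []):
    resp = requests.get(att["url"], params={"token": token})
    resp.raise_for_status()
    with open(att["name"], "wb") as out:
        out.write(resp.content)
    print("Saved", att["name"])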

Unfortunately, I didn't discover this trick until my current feature services were a couple months into...service. While Power Automate can, in some cases, re-run flow triggers from the past 30 days, that still leaves 2-3 months of submissions to multiple feature services that are not represented in my archive.

I would really like to get these submissions downloaded onto my local machine so that I can update some aspects of them that make them "legacy". Does anyone know of a way to achieve this? I don't have the option to edit and resend every submission in this case. I also haven't had great luck exporting feature services as geodatabases, though I could probably manage an approach that requires it with some guidance. Really, though, I am hoping there is a simpler path!

Thanks!

4 Replies
DougBrowning
MVP Esteemed Contributor (Accepted Solution)

The item page in AGOL will let you download all kinds of formats, including JSON. I use it all the time and it works great up to about 10 GB; after that it gets cranky.

I also have a nightly Python backup script that backs up my AGOL services to GDB. It does about 15 every night. I can even back up by layer to drop the attachments and cut the size down.

You may be able to do some things with ArcGIS for SharePoint, but I'm not sure there. I would also look into ArcGIS for Excel if you want to see multiple forms and tables all at once. Just make a map of all your forms, then in Excel click the map and the layers add right in. Plus it's all live for updates. You can even edit right there.

Hope that helps

MattEdrich
Frequent Contributor

Wow, I can't believe I missed the GeoJSON export - thanks Doug! Experimenting with a few of those exports, I can see I will have my work cut out for me when it comes to reconstructing submissions that end up spread out across multiple layers and tables (repeats) within a single form. I definitely will be exploring your other suggestions, but I am guessing now that working through some ArcGIS Notebooks within ArcGIS Pro might actually be the best way to go. At least I now have a jumping-off point!
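For anyone following along, my starting point for stitching repeats back onto their parent records is roughly the sketch below. The parentglobalid/globalid linkage is how Survey123 typically relates a repeat table to the parent layer, but the layer/table indexes and field names here are assumptions - check your own service:

from arcgis.gis import GIS

gis = GIS("pro")  # reuse the ArcGIS Pro login
item = gis.content.get("<feature service item id>")  # placeholder ID

parent = item.layers[0]   # main form layer (assumption: index 0)
repeat = item.tables[0]   # repeat table (assumption: first table)

# Index parent records by globalid, then nest each repeat record under its parent
parents = {f.attributes["globalid"]: dict(f.attributes, repeats=[])
           for f in parent.query(where="1=1").features}

for f in repeat.query(where="1=1").features:
    key = f.attributes.get("parentglobalid")
    if key in parents:
        parents[key]["repeats"].append(f.attributes)

# Each value in parents is now one submission with its repeat records nested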

DougBrowning
MVP Esteemed Contributor

Code for it is not too bad. Excel is probably the easiest, or even a Dashboard.

import os
import time
from arcgis.gis import GIS

# Setup inputs
configFile = r"config.txt"   # each line is FriendlyName,ServiceItemID
backupDir = r""              # set this to your backup folder
# --------------------------------------------

# Connect using the ArcGIS Pro login
gis = GIS('pro')

# Read in the config file for the list of hosted feature services (HFS) for Survey123
with open(configFile, 'r') as f:
    HFSList = f.read().splitlines()

for HFS in HFSList:
    HFSname, itemId = HFS.split(",")[:2]   # the item ID comes from the item's main AGOL settings page

    # Start the export-to-GDB job
    dt = time.strftime("%Y%m%d_%H%M%S")
    outName = "{}_{}.zip".format(HFSname, dt)
    print("Export job started for {} {}".format(HFSname, dt))
    fsLink = gis.content.get(itemId)
    result = fsLink.export("export" + HFSname + dt, "File Geodatabase")

    # Save the zipped FGDB to the file system
    print("Saving final downloaded FGDB to {}...".format(os.path.join(backupDir, outName)))
    result.download(save_path=backupDir, file_name=outName)

    # Remove the export item from AGOL (cleanup)
    print("Removing the export file from AGOL")
    deleteResult = result.delete()
    print("Delete result is " + str(deleteResult))

 

MattEdrich
Frequent Contributor

You're right, the code wasn't too bad! I did have some annoying challenges just getting Jupyter Notebooks to work within ArcGIS Pro, though. Ultimately I opted for the MVP version: building out a script (an .ipynb, really) in VS Code that connects to my org's AGOL portal, locates specific feature services based on input, and then parses each record in the feature service and exports it as JSON. This approach worked great for me, since I already had some example JSON payloads I could use to model the data structure I needed - and since I was able to get the attachment URLs for all nested tables within the feature service, I could make sure that every piece of data we collected in 2024 was properly updated without too much manual effort.
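In case it helps the next person, a condensed sketch of the export step is below. The org URL, search query, and output naming are illustrative placeholders, and my actual notebook also walks the nested repeat tables:

import json
from arcgis.gis import GIS

gis = GIS("https://myorg.maps.arcgis.com", "username")  # placeholder org URL/user; prompts for a password

# Locate the hosted feature service by name (query string is illustrative)
item = gis.content.search("MySurvey2024", item_type="Feature Layer Collection")[0]
layer = item.layers[0]

records = []
for f in layer.query(where="1=1", out_fields="*").features:
    rec = dict(f.attributes)
    # Record attachment URLs so files stay retrievable while the service is intact
    oid = f.attributes[layer.properties.objectIdField]
    rec["attachment_urls"] = [
        "{}/{}/attachments/{}".format(layer.url, oid, a["id"])
        for a in layer.attachments.get_list(oid)
    ]
    records.append(rec)

with open("MySurvey2024_records.json", "w") as out:
    json.dump(records, out, indent=2)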

Thanks for the encouragement Doug!