How to automate AGOL backups from a specific online folder

11-09-2022 02:46 PM
Hatsocks
New Contributor III

Hello folks,

I need to do batch backups of my online data.  I used this page as a guide to create a script in a notebook:

https://support.esri.com/en/technical-article/000022524

The script works well, it exports an FGDB for each feature layer and downloads it to a local folder.  But I don't necessarily want to back up everything I have online every time I run the script.  I would like to be able to specify just this or that folder to be backed up.  Can anyone suggest how I would adapt the script to do that?  I am totally new to Python so any help will be appreciated.  Thank you!

DougBrowning
MVP Esteemed Contributor

I just use a simple txt config file like this

NameToCallIt,713e3aaef9677d333b618
secondone,itemid
thirdone,itemid

Each line holds the name to call the output file, then the itemid.

My script reads that file in and loops over the entries, so you can swap in any config file you want on each run.

 

import os
import time

from arcgis.gis import GIS

# vars for config file and output dir---------
# Setup inputs
configFile = r"C:\ArcProBackup.txt"
backupDir = r"c:\HFSBackups"
#--------------------------------------------

# Make a connection using the ArcGIS Pro sign-in
gis = GIS('pro')

# Read in the config file: one "name,itemid" pair per line
with open(configFile, 'r') as BackupAllAGOHFSConfigFile:
    HFSList = BackupAllAGOHFSConfigFile.read().splitlines()

for HFS in HFSList:
    HFSname = HFS.split(",")[0]
    itemId = HFS.split(",")[1]          # the itemid comes from the item's main AGO page with all the settings

    # Start the export-to-FGDB job
    print("Export job started for " + HFSname)
    fsLink = gis.content.get(itemId)
    result = fsLink.export("tempOut" + HFSname, "File Geodatabase")

    # Download the zipped FGDB to the local backup folder
    dt = time.strftime("%Y%m%d_%H%M%S")
    out_name = "{}_{}.zip".format(HFSname, dt)
    print("Saving final downloaded FGDB to {}...".format(os.path.join(backupDir, out_name)))
    result.download(save_path=backupDir, file_name=out_name)

    # Remove the exported FGDB item from AGOL (cleanup)
    print("Removing the export file from AGOL")
    deleteResult = result.delete()
    print("Delete result is " + str(deleteResult))

 

 Hope that helps

Hatsocks
New Contributor III

Thank you for the quick reply.  I really am an absolute beginner with Python, so I might be totally misunderstanding your script, but it looks to me as if it specifies a local folder to put the downloads into.  I would like to specify an AGOL folder to download HFSs from.  Or have I completely missed the point?  Wouldn't be the first time...

DougBrowning
MVP Esteemed Contributor

The backup script posted is for hosted services and takes the ItemID of the service.  Makes no difference what folder they are in.  There might be a way to read a folder and get a list of any services in it but I have no idea. 
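For anyone finding this later: the ArcGIS API for Python does have a way to list items by folder — `User.items(folder=...)` returns the items in a named folder, so the export loop can be driven by a folder instead of a config file. A rough sketch (untested; `backup_folder`, `filter_feature_services`, and the folder name are mine, and `items(folder=...)` behavior may vary by API version):

```python
import time


def filter_feature_services(items):
    """Keep only hosted Feature Service items from a list of AGOL items."""
    return [i for i in items if getattr(i, "type", None) == "Feature Service"]


def backup_folder(folder_name, backup_dir):
    """Export every Feature Service in one AGOL folder to a zipped FGDB."""
    from arcgis.gis import GIS  # imported here so the helper above stays standalone

    gis = GIS("pro")  # reuse the ArcGIS Pro sign-in, as in the script above
    items = gis.users.me.items(folder=folder_name)

    for item in filter_feature_services(items):
        dt = time.strftime("%Y%m%d_%H%M%S")
        result = item.export("tempOut" + item.title, "File Geodatabase")
        result.download(save_path=backup_dir,
                        file_name="{}_{}.zip".format(item.title, dt))
        result.delete()  # remove the temporary export item from AGOL
```

Called as e.g. `backup_folder("MyBackupFolder", r"c:\HFSBackups")`, where "MyBackupFolder" is whatever your AGOL folder is actually named.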

Hatsocks
New Contributor III

Ok, thanks for trying! 🙂  I might try generating an Item Report and see if I can identify just the items in a certain folder, and get the ItemIDs from that, but if anyone else has any ideas I'd love to hear them!  Thanks.
