
How to automate AGOL backups from a specific online folder

11-09-2022 02:46 PM
Hatsocks
Emerging Contributor

Hello folks,

I need to do batch backups of my online data.  I used this page as a guide to create a script in a notebook:

https://support.esri.com/en/technical-article/000022524

The script works well: it exports a FGDB for each feature layer and downloads it to a local folder. But I don't necessarily want to back up everything I have online every time I run the script. I'd like to be able to specify just this or that folder to back up. Can anyone suggest how I would adapt the script to do that? I'm totally new to Python, so any help will be appreciated. Thank you!

5 Replies
DougBrowning
MVP Esteemed Contributor

I just use a simple txt config file like this:

NameToCallIt,713e3aaef9677d333b618
secondone,itemid
thirdone,itemid

Each line has the name to call the output file, then the itemid.

My script reads that file and loops over the lines. You can swap in any config file you want on each run.
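The read-and-loop step can be sketched as a small parser (the helper name is made up; the format is the name,itemid lines shown above):

```python
def parse_backup_config(text):
    """Parse 'name,itemid' lines into (name, itemid) pairs, skipping blanks.

    Splits on the first comma only, so a name could itself contain commas
    in the itemid portion without breaking the parse.
    """
    pairs = []
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue  # ignore blank lines in the config file
        name, item_id = line.split(",", 1)
        pairs.append((name.strip(), item_id.strip()))
    return pairs
```

The script below does the same thing inline with `splitlines()` and `split(",")`; a helper like this just makes blank lines and stray whitespace harmless.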

 

# vars for config file and output dir---------
import os
import time

from arcgis.gis import GIS

# Setup inputs
configFile = r"C:\ArcProBackup.txt"
backupDir = r"c:\HFSBackups"
#--------------------------------------------

# Make a connection through the active ArcGIS Pro sign-in
gis = GIS('pro')

# read in the config file for the list of HFS for Survey123
with open(configFile, 'r') as BackupAllAGOHFSConfigFile:
    HFSList = BackupAllAGOHFSConfigFile.read().splitlines()


for HFS in HFSList:
    HFSname = HFS.split(",")[0]
    itemId = HFS.split(",")[1]          # the itemid comes from the item's details page in AGO

    # Start the export to GDB job
    print("Export job started for " + HFSname)
    fsLink = gis.content.get(itemId)
    result = fsLink.export("tempOut" + HFSname, "File Geodatabase")

    # Save to file system
    dt = time.strftime("%Y%m%d_%H%M%S")
    out_name = "{}_{}.zip".format(HFSname, dt)
    print("Saving final downloaded FGDB to {}...".format(os.path.join(backupDir, out_name)))
    result.download(save_path=backupDir, file_name=out_name)

    # Remove the exported FGDB item from AGOL (cleanup)
    print("Removing the export file from AGOL")
    deleteResult = result.delete()
    print("Delete result is " + str(deleteResult))

 

 Hope that helps

Hatsocks
Emerging Contributor

Thank you for the quick reply. I really am an absolute beginner with Python, so I might be totally misunderstanding your script, but it looks to me as if it specifies a local folder to put the downloads into. I would like to specify an AGOL folder to download HFSs from. Or have I completely missed the point? Wouldn't be the first time...

DougBrowning
MVP Esteemed Contributor

The backup script posted is for hosted services and takes the ItemID of the service, so it makes no difference what folder they are in. There might be a way to read a folder and get a list of any services in it, but I have no idea.
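Edit: there may actually be a way — the Python API's `User.items()` accepts a folder name, so you could build the list from one AGOL folder instead of a config file. A rough, untested sketch (the helper and folder names are made up; it only relies on each item having `.title`, `.id`, and `.type`):

```python
from collections import namedtuple

# Stand-in for the arcgis Item objects that User.items(folder=...) returns;
# only .title, .id, and .type are used here.
Item = namedtuple("Item", ["title", "id", "type"])

def backup_targets(items, item_type="Feature Service"):
    """Filter a folder's contents down to (title, id) pairs of one item type."""
    return [(it.title, it.id) for it in items if it.type == item_type]

# With a live connection this would look something like (assumed, untested):
# from arcgis.gis import GIS
# gis = GIS('pro')
# folder_items = gis.users.me.items(folder="MyBackupFolder")
# for title, item_id in backup_targets(folder_items):
#     print(title, item_id)
```

Those (title, id) pairs are exactly what my config file holds, so you could feed them straight into the backup loop above.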

Hatsocks
Emerging Contributor

Ok, thanks for trying! 🙂  I might try generating an Item Report to see if I can identify just the items in a certain folder and get the ItemIDs from that, but if anyone else has any ideas I'd love to hear them! Thanks.

PhilipOhlinger
Occasional Contributor

Here is a script I've been using to back up AGOL data. I used GOVAI to help me write this, and it works like a charm. The script creates a log file to keep track of the backups each day and has some basic error handling.

You need to replace the following placeholders in the script with your own values:

#BASEFOLDERPATH
#PORTAL
#AGOL USERNAME
#AGOL PASSWORD
I wrote this script so it only looks for feature services that I have tagged with "DailyBackup" in my AGOL account. You can change this to look for a different tag by editing the section of code that defines the tag.

I used Windows Task Scheduler to run this every day. When you create the task, for the action choose "Start a program"; for the program/script, choose the location of your ArcGIS Pro Python installation (similar to this: "C:\Program Files\ArcGIS\Pro\bin\Python\envs\arcgispro-py3\python.exe"); and for "Add arguments" put the path to this Python script, like this: "C:\Users\username\GIS\Scripts\Python Scripts\AGOL_Daily_Backup.py"

 

from time import strftime  
from arcgis.gis import GIS  
import logging  
import os  
  
# Initialize logging  
log_file = r"#BASEFOLDERPATH\backup_log.txt"  
logging.basicConfig(filename=log_file, level=logging.INFO)
  
def log_message(message):  
    logging.info(f"{strftime('%Y-%m-%d %H:%M:%S')} - {message}")  
  
# Input Variables  
base_output_dir = r"#BASEFOLDERPATH"  
date_str = strftime('%Y_%m_%d')  
  
# Create Backup Function  
def backup_feature_services():  
    try:  
        # Log into AGOL  
        gis = GIS('https://#PORTAL.maps.arcgis.com/', '#AGOL USERNAME', '#AGOL PASSWORD')
          
        # Search for items with the tag 'DailyBackup'  
        search_query = "tags:DailyBackup"  
        items = gis.content.search(query=search_query, item_type='Feature Service')  
  
        if not items:  
            log_message("No items found with the tag 'DailyBackup'.")
            return  
  
        # Create a folder for today's date  
        date_folder_path = os.path.join(base_output_dir, date_str)  
        if not os.path.exists(date_folder_path):  
            os.makedirs(date_folder_path)  
  
        for item in items:  
            try:  
               
                log_message(f"Processing item: {item.title} (ID: {item.id})")  
  
                # Export the item to File Geodatabase
                tempfile = f"{item.title}_{date_str}"
                export_item = item.export(tempfile, 'File Geodatabase', parameters=None, wait=True)  

                # Search for and download the exported File Geodatabase
                # (with wait=True, export() already returns the export item;
                # the search is a second lookup by title)
                myexport = gis.content.search(tempfile, item_type='File Geodatabase')  
                if myexport:  
                    fgdb = gis.content.get(myexport[0].id)  
                    fgdb.download(save_path=date_folder_path)  
                    fgdb.delete()  
                    log_message(f"Successfully backed up: {item.title} to {date_folder_path}")  
                else:  
                    log_message(f"Error: Export for {item.title} not found.")  
              
            except Exception as e:  
                log_message(f"Error processing item {item.title}: {str(e)}")  
          
        log_message("Script completed successfully.")  
  
    except Exception as e:  
        log_message(f"General error: {str(e)}")  
  
# Execute the script  
if __name__ == "__main__":  
    if not os.path.exists(base_output_dir):  
        os.makedirs(base_output_dir)  
    log_message("Backup process started.")  
    backup_feature_services()