
Automate Export of Hosted Feature Layer

JonM32 (Frequent Contributor)

Good morning,

Is there a way to automate exporting a hosted feature layer in ArcGIS Online?

I've seen posts regarding this, but they all have it set up to automatically export the feature layer to a local drive, which I don't want.

I would like to set up an automated process to export my feature layer and save a copy of the export in a folder within my ArcGIS Online account. It would be nice to set up the backup process to save backups for, say, a month, and then delete anything older than that.

Is this possible?

Thanks,

Jon

David_McRitchie (Esri Regular Contributor)

Hey Jon, you could probably do this with the ArcGIS API for Python.

Regarding uploading a copy of content to ArcGIS Online, it depends on what you require. You could download a local copy and automate uploading it to a specific folder on ArcGIS Online, or alternatively run clone_items and direct the clone to the other folder.
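For the clone approach, a minimal sketch would be along these lines (the item ID and folder name are placeholders, and the target folder needs to already exist in the organisation):

from arcgis.gis import GIS

gis = GIS("home")
source_item = gis.content.get("<feature-layer-item-id>")
# clone the item into a specific folder in the same organisation
gis.content.clone_items([source_item], folder="Backups")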

For managing content, you could also run a check for content that is older than a specific date, along with another filter to avoid touching other users' content. Perhaps a specific tag, or a user who is responsible for all backups?
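As a rough example of that kind of clean-up (the "backup" tag and the 30-day window are just illustrative):

from datetime import datetime, timedelta
from arcgis.gis import GIS

gis = GIS("home")
cutoff = datetime.now() - timedelta(days=30)
# restrict the search to your own tagged backup items
query = 'tags:"backup" AND owner:' + gis.users.me.username
for item in gis.content.search(query=query, max_items=100):
    # item.created is milliseconds since the Unix epoch
    if datetime.fromtimestamp(item.created / 1000) < cutoff:
        item.delete()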

For getting started, I would recommend checking the samples on the ArcGIS API for Python site, particularly the ones under the Org Administrators section. The following post also has a code sample that could be used for identifying content that is not commonly being looked at.

Hope that helps give some initial ideas,

David

Esri UK - Technical Support Analyst
JonM32 (Frequent Contributor)

@David_McRitchie Thanks David, I'll take a look at this.

I've always done it manually from the feature layer's item page; when you export the layer there, it gets saved to AGOL automatically, and you can then download it if needed. Hopefully there's a way to do it without saving a local copy, since I would like it to run solely within AGOL.

Jon
MobiusSnake (MVP Regular Contributor)

I built a process nearly identical to what you described for a client. I use an AGOL Notebook scheduled to run once a day, and it does the following:

  • Read a hosted feature layer table that's used for configuration; it contains a list of item IDs of the hosted feature layers to back up
  • For each item, call Item.export(); this exports each layer to an FGDB in an AGOL folder I have configured
  • After all the backups have run, check the creation date on the other FGDBs in that folder (using the Content Manager's search call); anything beyond a configured threshold (I use 14 days currently) gets deleted

Edit to add: I just remembered that the export() call doesn't take an AGOL folder as a parameter; it automatically saves to your root folder. You have to wait until it finishes running, then use a move function to move the export to the intended folder. A rough sketch of the whole notebook is below.
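This is stripped down, with the item IDs, folder name, and retention window as placeholders rather than my client's actual config (and I've used users.me.items here to list the folder contents, which is a bit simpler than the search call I mentioned):

from datetime import datetime, timedelta, timezone
from arcgis.gis import GIS

gis = GIS("home")  # authenticates as the notebook owner when run in an AGOL Notebook

BACKUP_FOLDER = "Backups"   # placeholder; the folder must already exist
RETENTION_DAYS = 14
layer_ids = ["<item-id-1>", "<item-id-2>"]  # in practice, read from the config table

for layer_id in layer_ids:
    source = gis.content.get(layer_id)
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    # export() always writes to the root folder; wait=True blocks until the new item exists
    backup = source.export(source.title + "_backup_" + stamp, "File Geodatabase", wait=True)
    backup.move(BACKUP_FOLDER)  # then move the export into the backup folder

# prune file geodatabases in the backup folder that are past the retention window
cutoff = datetime.now(timezone.utc) - timedelta(days=RETENTION_DAYS)
for item in gis.users.me.items(folder=BACKUP_FOLDER, max_items=200):
    created = datetime.fromtimestamp(item.created / 1000, tz=timezone.utc)  # created is in ms
    if item.type == "File Geodatabase" and created < cutoff:
        item.delete()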

JonM32 (Frequent Contributor)

@MobiusSnake This is what I have so far.

The backup ran successfully in a notebook with the code in the first screenshot. It saved a copy of the layer to my root folder in AGOL.

What I'm stuck on now is how to connect the move function to move the export to a backup folder. I tried running the code in the second screenshot after the first code chunk, but I keep running into an error.

I appreciate the help!!

Jon

 

[Screenshot 1: 2025-06-05 11_44_51.png]
[Screenshot 2: 2025-06-05 11_45_55.png]

David_McRitchie (Esri Regular Contributor)

Hey Jon, what are we using for the imports and authentication?

from arcgis.gis import GIS
gis = GIS("home")

I presume you have the above, given that the initial gis.content.get worked. Maybe worth checking that the account being used within the script has access to the 2025_Backups folder?
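If you want to be safe, the script can check for the folder up front and create it if it's missing, along these lines (a quick sketch, assuming a version of the API where users.me.folders returns folder dictionaries):

from arcgis.gis import GIS

gis = GIS("home")
# list the folder names visible to the signed-in account
folder_titles = [f["title"] for f in gis.users.me.folders]
if "2025_Backups" not in folder_titles:
    gis.content.create_folder("2025_Backups")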

 

David

 

Esri UK - Technical Support Analyst
JonM32 (Frequent Contributor)

@David_McRitchie I'm using what you have listed in your screenshot.

I had success! But now have another question related to the moving part.

I was able to get the move method to work, as shown in my screenshot. The item ID was used to .get the item and then move it. The issue is that when I run this on a schedule and produce multiple backups over, say, a two-week period, I will end up with multiple items in my root folder with different item IDs. How can I set this code up to .get the items if I don't know the item IDs ahead of time (since they are randomly generated with each backup)?
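This is roughly what the screenshot shows (with the item ID redacted):

from arcgis.gis import GIS

gis = GIS("home")
# grab the exported FGDB by its item ID, then move it to the backup folder
backup_item = gis.content.get("<item-id-of-the-exported-FGDB>")
backup_item.move("2025_Backups")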

If that doesn't make sense, just let me know.

[Screenshot: 2025-06-05 14_01_57.png]

Jon