
AGOL Notebook - item.export and the 'overwrite' argument

10-29-2021 08:45 AM
AaronKoelker
Frequent Contributor

I'm writing a notebook to export a hosted feature layer that I own as a File Geodatabase item in my content, using item.export. This would serve as a backup copy of my data, if needed. I then intend to schedule a task for the notebook so it runs on its own periodically -- and ideally it would overwrite the previously created backup each time, so that I don't have a bunch of items creating clutter and taking up storage space.

The item.export method has an 'overwrite' argument to replace an existing item, which sounds like what I need; however, I'm either misunderstanding it or it isn't working correctly. When I set overwrite=True, it just creates a new item in my content with the exact same name.

dataitem = gis.content.get(itemid)  # itemid = ID of the hosted feature layer to back up
backupname = f'{dataitem.title} Backup'
dataitem.export(backupname, 'File Geodatabase', parameters=None, wait=True, overwrite=True)

Any idea if this is a bug or intended behavior? Is there a better way to do this? I tried using a search to locate the previous backup so I could just delete it -- but even when feeding in the exact item name, it would occasionally pull up some other item in the org.
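For reference, here's a rough sketch of that search-and-delete approach (the title below is a placeholder, and the exact-title filter is just one way I could try to narrow the results):

from arcgis.gis import GIS

gis = GIS("home")
username = gis.properties.user.username

backupname = "My Layer Backup"  # placeholder title of the previous backup item
searchresult = gis.content.search(query=f'title:"{backupname}" AND owner:{username}',
                                  item_type="File Geodatabase")
exact_matches = [itm for itm in searchresult if itm.title == backupname]  # title searches can match loosely
for itm in exact_matches:
    itm.delete()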

-Aaron

8 Replies
by Anonymous User
Not applicable

@AaronKoelker thanks for reporting. This could be a bug in the Python API. The server does not actually support overwriting in this operation. I have logged a bug and we will investigate.

AaronKoelker
Frequent Contributor

Bummer, but thanks for the info.

If anyone has any ideas for how to achieve this through another method, I'd love to hear it. Looking into using item.update now.

-Aaron
by Anonymous User
Not applicable

@AaronKoelker we decided to remove this parameter in v2.0 of the API (a future release). The reason is that the server does not establish a comprehensive item-to-item relationship for this use case, so the API does not know which item to overwrite. There is also a cardinality issue: a source item can be exported multiple times, and the API would not know which of those items to overwrite.

The alternative is for you (as the user) to determine which item needs to be updated and call the https://developers.arcgis.com/python/api-reference/arcgis.gis.toc.html#arcgis.gis.Item.update (DestinationItem.update()) method on it, passing the file returned from the SourceItem.export() call.
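A minimal sketch of that export-then-update pattern (the item IDs, temporary title, and download path below are placeholders, not values from this thread):

from arcgis.gis import GIS

gis = GIS("home")

source_item = gis.content.get("<source item id>")          # hosted feature layer to back up
backup_item = gis.content.get("<existing backup item id>") # previously exported File Geodatabase item

temp_item = source_item.export("Temp Backup", "File Geodatabase", wait=True)  # export() returns the new item
fgdb_zip = temp_item.download(save_path="/arcgis/home")    # path to the exported file geodatabase zip

backup_item.update(data=fgdb_zip)  # replace the existing backup item's data with the fresh export
temp_item.delete()                 # clean up the temporary export item

A scheduled notebook can then re-run this against the same backup item, so duplicate items do not accumulate.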

AaronKoelker
Frequent Contributor

Thanks for the update! I was able to get the item.update method to work for my purposes. Appreciate you taking a look.

-Aaron
heatherdaw
Emerging Contributor

Hi Aaron. I'm trying to do the same thing, but I haven't figured it out. Would you mind sharing your script? Much appreciated if you would.

AaronKoelker
Frequent Contributor

@heatherdaw this is the full script I'm using to create a backup fgdb of a hosted feature layer in my content. It overwrites the existing backup each time. Hope that helps.

import os
from time import strftime
from arcgis.gis import GIS

gis = GIS("home")
username = gis.properties.user.username
home_dir = os.path.abspath(os.path.join(os.sep, 'arcgis', 'home'))

# Variables
itemid = "000abc000abc000abc000abc000abc00" # AGO layer to backup
itemtype = "File Geodatabase" # export format
itemtags = "backup, fgdb" # item tags
itemdesc = strftime("This backup was generated on %m/%d/%Y.") # item description
folderlocation = "/" # export folder, / is root folder

# Check to see if a backup file geodatabase already exists
dataitem = gis.content.get(itemid) #fetch the item to be backed-up
backupname = f"{dataitem.title} Backup" #acquire backup name using the source hosted feature layer
searchquery = f"title:{backupname} AND owner:{username}" # search query using item title and owner
searchresult = gis.content.search(query=searchquery, item_type=itemtype, sort_field='uploaded', sort_order='desc') #search for existing backups

print(f'Found {len(searchresult)} existing backups')
# Update existing backup item, if it exists
if len(searchresult) >= 1:
    print('Updating existing backup file...')
    
    dataitem = gis.content.get(itemid) #fetch the item to be backed-up
    backupname = f'{dataitem.title} Backup' #create a name for the backup using the existing item name
    dataitem.export(backupname, itemtype, parameters=None, wait=True, tags=itemtags, snippet=itemdesc) #export the data to a file geodatabase within the user's content
    
    searchquery = f"title:{backupname} AND owner:{username}" # search query using item title and owner
    searchresult = gis.content.search(query=searchquery, item_type=itemtype, sort_field='uploaded', sort_order='desc') #find the new temporary backup item that was just created
    backupitem = gis.content.get(searchresult[0].itemid)
    dwnldname = backupname.replace(" ","_")
    backupitem.download(save_path=home_dir, file_name=dwnldname + '.zip')
    
    searchquery = f"title:{backupname} AND owner:{username}" # search query using item title and owner
    searchresult = gis.content.search(query=searchquery, item_type=itemtype, sort_field='uploaded', sort_order='asc') #search for the previous backup
    oldbackup = gis.content.get(searchresult[0].itemid)
    itemprops = {"snippet":f"{itemdesc}","description":f"{itemdesc}"}
    oldbackup.update(item_properties=itemprops, data=os.path.join(home_dir, dwnldname + '.zip')) #replace the old backup's data with the freshly downloaded zip

    backupitem.delete(dry_run=False) #Can change dry_run to True for testing, won't remove duplicate backups
    
#Create a new backup if no previous backup exists   
else:
    print('Creating new backup...')
    
    dataitem = gis.content.get(itemid) #fetch the item to be backed-up
    backupname = f'{dataitem.title} Backup' #create a name for the backup using the existing item name
    dataitem.export(backupname, itemtype, parameters=None, wait=True, tags=itemtags, snippet=itemdesc) #export the data to a file geodatabase within the user's content
    
    ## Move the backup to designated folder, if not root
    searchquery = f"title:{backupname} AND owner:{username}" # search query using item title and owner
    searchresult = gis.content.search(backupname, item_type=itemtype) #find the backup that was just created
    backupitem = gis.content.get(searchresult[0].itemid) #get the itemid of that new item
    if folderlocation != "/":
        print('Moving backup to designated folder...')
        backupitem.move(folder=folderlocation) #move the item to the correct user folder, if user chose somewhere other than the default root

-Aaron
heatherdaw
Emerging Contributor

Thanks so much for the swift reply! I'll take a look at this.

AaronKoelker
Frequent Contributor

@heatherdaw sorry, I forgot the top bit of the script, which imports strftime and defines the username and home directory. I edited the original post above to include those at the top.

-Aaron