IDEA
This is my solution for backing up AGOL data myself: https://community.esri.com/t5/arcgis-online-questions/how-to-create-hosted-feature-layer-backups/m-p/1419497#M59065
Posted a month ago | Kudos 0 | Replies 0 | Views 170
IDEA
Especially relevant for the Utility Network. Our Pipeline Device layer has 184 fields, most of them irrelevant.
Posted 09-04-2025 05:39 AM | Kudos 0 | Replies 0 | Views 138
POST
Does it work if you change the line from "S:\e\e65-1.tif" to r"S:\e\e65-1.tif"?
Posted 01-23-2025 05:27 AM | Kudos 0 | Replies 0 | Views 357
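For anyone hitting this for the first time: the r prefix makes Python treat backslashes literally instead of as escape sequences, which matters for Windows paths. A minimal illustration (the \temp path here is my own example, not the original poster's):

```python
# In a normal Python string literal, "\t" is read as a TAB character,
# so a Windows path like "S:\temp\e65-1.tif" is silently corrupted.
plain = "S:\temp\e65-1.tif"   # the "\t" became a real tab
raw = r"S:\temp\e65-1.tif"    # raw string: backslashes kept literally

print("\t" in plain)  # True  -> the path is broken
print("\t" in raw)    # False -> the path is intact
```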
POST
|
Can you set an Except statement to print the URL that cannot be reached so the script will continue instead of stop?
... View more
01-03-2025
08:35 PM
|
0
|
1
|
2100
|
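A minimal sketch of that pattern (the URL list and the fetch() helper are stand-ins I made up, not the original script): catch the failure, log which URL broke, and keep looping.

```python
# Hypothetical fetch loop: on failure, report the URL and keep going
urls = ["https://good.example/", "https://bad.example/"]

def fetch(url):
    # Stand-in for the real request call; "bad" URLs raise here
    if "bad" in url:
        raise ConnectionError("unreachable")
    return "ok"

failed = []
for url in urls:
    try:
        fetch(url)
    except Exception as e:
        print("Could not reach {}: {}".format(url, e))
        failed.append(url)  # remember failures, but don't stop the loop

print(failed)  # ['https://bad.example/']
```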
POST
Hi Hanlie, as far as restoring an item from JSON if you accidentally deleted it (or if I just want to compare the current vs. backed-up item): I create a brand-new web map or dashboard manually in Chrome, then copy/paste the JSON from the backup and overwrite the existing JSON using https://ago-assistant.esri.com/. The problem with that is you get a new item ID, which can break other things like scripts or links you'd already established. I've used the script mostly to:
1. See what items in my org were changed and when, with a daily log to refer back to or restore from.
2. If an item really got messed up, go into AGO Assistant and replace its JSON with the backup JSON to see if that fixes it.
To do the replacement in AGO Assistant:
1. Search for the item by name or item ID (by default it only displays items you own).
2. Find the item and in the dropdown choose "I want to... View an Item's JSON".
3. Replace the Data portion of its JSON with the .json file you have backed up on your network.
4. Hit save, then view that item in Chrome or Field Maps.
Posted 11-18-2024 10:14 AM | Kudos 0 | Replies 1 | Views 1007
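For the compare use case, a quick way (my own sketch, not part of the workflow above) to diff live item JSON against a backed-up copy:

```python
import difflib
import json

# Hypothetical data: 'live' stands in for item.get_data() output,
# 'backup' for a .json file previously saved to disk
live = {"version": "2.9", "widgets": ["map", "list"]}
backup = {"version": "2.9", "widgets": ["map"]}

# Quick check: has anything changed at all?
print(live == backup)  # False

# For a readable diff, compare pretty-printed lines
diff = difflib.unified_diff(
    json.dumps(backup, indent=2, sort_keys=True).splitlines(),
    json.dumps(live, indent=2, sort_keys=True).splitlines(),
    lineterm="")
print("\n".join(diff))
```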
POST
Yes, @MappyIan -- that solved my problem. I wanted to share what I ended up doing. I'm downloading AGOL hosted feature layers to GDB on a nightly basis (via Python and Task Scheduler). Every run creates a File GDB item in AGOL that I used to delete right after its creation. Since July 2024, it instead goes to the Recycle Bin for 14 days (which consumes credits, as these are >1 GB files). So instead of deleting the File GDB, I modified my script to add a tag to each item: 'ToDelete'. Then in AGOL Notebooks (using Runtime 10), I schedule a daily script to permanently delete content with that tag:

from arcgis.gis import GIS

gis = GIS("home")

def find_items_with_tag(tag):
    # Search for File Geodatabase items carrying the specified tag
    items = gis.content.search(query="tags: = '{}'".format(tag),
                               item_type="File Geodatabase", max_items=300)
    # Return the item IDs of the found items
    return [item.id for item in items]

if __name__ == "__main__":
    tag_to_search = "ToDelete"
    item_ids_list = find_items_with_tag(tag_to_search)
    print("Item IDs of items with tag '{}':".format(tag_to_search))
    for item_id in item_ids_list:
        print(item_id)
        item_to_delete = gis.content.get(item_id)
        item_to_delete.delete(permanent=True)

Posted 10-03-2024 08:43 AM | Kudos 0 | Replies 0 | Views 2236
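One caveat from my side (not stated in the post): AGOL search can match tags loosely, so before permanently deleting it may be safer to verify the exact tag client-side. A sketch with stand-in items in place of real search results:

```python
# Stand-in items; in the real script these come from gis.content.search()
class FakeItem:
    def __init__(self, item_id, tags):
        self.id = item_id
        self.tags = tags

items = [
    FakeItem("abc123", ["ToDelete", "backup"]),
    FakeItem("def456", ["ToDeleteLater"]),  # a loose search could match this
]

# Keep only items whose tag list contains the exact tag
to_delete = [i.id for i in items if "ToDelete" in i.tags]
print(to_delete)  # ['abc123']
```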
POST
This is what I do: https://community.esri.com/t5/arcgis-online-questions/how-to-create-hosted-feature-layer-backups/m-p/1419453
Posted 08-28-2024 09:37 AM | Kudos 0 | Replies 0 | Views 840
POST
@JustinColville Has anyone successfully been able to permanently delete an AGOL item via Python AFTER Esri added the recycle bin?

from arcgis.gis import GIS

gis = GIS("home")
item_id = '767ff8fe699047d7bc0e74c0f15a63a4'  # service ID of the hosted feature layer to delete
itemToDelete = gis.content.get(item_id)
itemToDelete.delete(permanent=True)

Running this code in AGOL Notebooks, as well as in Python 3.9.16 on my computer with Pro 3.1.2, I get the following error message:

TypeError                                 Traceback (most recent call last)
/tmp/ipykernel_19/3146111176.py in <cell line: 7>()
      5
      6 itemToDelete = gis.content.get(item_id)
----> 7 itemToDelete.delete(permanent=True)

TypeError: delete() got an unexpected keyword argument 'permanent'

Posted 07-18-2024 09:26 AM | Kudos 0 | Replies 1 | Views 3075
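One likely cause (my guess, not confirmed in this thread): an installed arcgis package version that predates the permanent keyword on Item.delete(). A defensive sketch that checks the signature before calling, shown with hypothetical stand-in classes for old and new API versions:

```python
import inspect

def delete_item(item):
    # Use permanent=True only when this arcgis version supports it;
    # older releases of item.delete() have no such keyword
    if "permanent" in inspect.signature(item.delete).parameters:
        return item.delete(permanent=True)
    return item.delete()  # fallback: the item goes to the recycle bin

# Hypothetical stand-ins mimicking old and new delete() signatures
class OldItem:
    def delete(self):
        return "soft-deleted"

class NewItem:
    def delete(self, permanent=False):
        return "hard-deleted" if permanent else "soft-deleted"

print(delete_item(OldItem()))  # soft-deleted
print(delete_item(NewItem()))  # hard-deleted
```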
POST
A couple things come to mind... When testing, I would add print(type(Initial_Date)) to your script. That way you can tell what data type it is and research how to apply a time delta to that format (or whether you need to convert it to a different datetime format). I have also seen issues where a date query doesn't work unless I give it a range of dates:

Date > 6/6/2024

In some cases that does not work, while

Date > 6/6/2024 AND Date < 1/1/2099

works when the first one doesn't.
Posted 06-07-2024 09:24 AM | Kudos 0 | Replies 0 | Views 1744
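To make the first suggestion concrete, a minimal sketch (the Initial_Date value is made up): check the type first, then a timedelta can be applied directly to a datetime.

```python
from datetime import datetime, timedelta

# Hypothetical value pulled from a feature attribute
Initial_Date = datetime(2024, 6, 6)

# First, confirm what type you are actually dealing with
print(type(Initial_Date))  # <class 'datetime.datetime'>

# Then a time delta can be applied directly
week_later = Initial_Date + timedelta(days=7)
print(week_later.strftime("%m/%d/%Y"))  # 06/13/2024
```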
POST
JSON backups

Similarly, I also back up the content JSON files in case the schema on a hosted layer gets changed, or a web map/dashboard/app gets inadvertently changed or broken. Esri web maps, dashboards, etc. have two parts to their JSON: the Description and the Data. The Data is where all the customizations you've made to the content are stored. The script downloads the JSON Data for the following item types in your Portal:
- Web Map
- Web Mapping Application
- Feature Layer
- Application
- Dashboard
- Web Experience
If a new item of one of those types is added to Portal, a new folder is created from the item name. If the content was modified since the last export, a new JSON file is saved.

import datetime
import glob
import json
import os

from arcgis.gis import GIS

gis = GIS("https://yourorg.maps.arcgis.com", "username", "password")

now = datetime.datetime.now()
dateString = now.strftime("%Y-%m-%d")
folder_location = r"C:\ArcGISOnline\BackupFiles\JSON"
item_types = {'Web Map', 'Web Mapping Application', 'Feature Layer',
              'Application', 'Dashboard', 'Web Experience'}

for itemType in item_types:
    subfolder_path = os.path.join(folder_location, itemType)
    os.makedirs(subfolder_path, exist_ok=True)
    print("\n" + itemType)
    itemType_items = gis.content.search(query="*", item_type=itemType, max_items=10000)
    for my_item in itemType_items:
        # Strip characters that are invalid in file/folder names
        removestring = "%:/,.\\[]<>*?$"
        item_title = ''.join(c for c in my_item.title if c not in removestring).strip()
        file_name = item_title + "_" + itemType + "_" + dateString + ".json"
        item_folder = os.path.join(subfolder_path, item_title)
        item_data = my_item.get_data(try_json=True)
        if len(item_data) > 0:  # only proceed if the JSON dump is not empty
            os.makedirs(item_folder, exist_ok=True)
            full_path = os.path.join(item_folder, file_name)
            # Get the date of the newest file already in the folder
            list_of_files = glob.glob(os.path.join(item_folder, "*"))
            if list_of_files:
                latest_file = max(list_of_files, key=os.path.getctime)
                newestFileDate = os.path.getmtime(latest_file)
            else:
                newestFileDate = 0  # empty folder: nothing exported yet
            # Save if the item was modified after the last export (or none exists)
            if newestFileDate == 0 or my_item.modified / 1000 > newestFileDate:
                print("\t" + item_title)
                print(full_path)
                with open(full_path, "w") as file_handle:
                    file_handle.write(json.dumps(item_data))

print("\n\nCompleted")

Posted 05-06-2024 09:48 AM | Kudos 2 | Replies 0 | Views 1463
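A side note on the my_item.modified / 1000 comparison in the script above: AGOL item timestamps are milliseconds since the Unix epoch, while os.path.getmtime() returns seconds, hence the division. For example (the timestamp value here is made up):

```python
import datetime

# Hypothetical AGOL 'modified' timestamp: milliseconds since the Unix epoch
modified_ms = 1715010480000

# os.path.getmtime() returns seconds, so divide by 1000 before comparing
modified = datetime.datetime.fromtimestamp(modified_ms / 1000,
                                           tz=datetime.timezone.utc)
print(modified.isoformat())  # 2024-05-06T15:48:00+00:00
```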
POST
Geodatabase Backups

I recommend adding a tag to any content you want to back up. Then, via a Python script, you can download any content with that tag to a file geodatabase and schedule the script (via Task Scheduler) to run daily or weekly as you need. I use the tag GDBNightlyBackup:

import datetime
import os

from arcgis.gis import GIS

startTime = datetime.datetime.now()
print(startTime)

# Enter AGOL sign-in credentials
gis = GIS("https://yourorg.maps.arcgis.com", "username", "password", verify_cert=False)

folderName = datetime.datetime.now().strftime("%Y-%m-%d")
parent_dir = r"C:\ArcGISOnline\BackupFiles\FeatureLayers"
path = os.path.join(parent_dir, folderName)

# Create the dated folder if it doesn't exist
if not os.path.isdir(path):
    os.mkdir(path)
    print("Directory '%s' created" % folderName)

def downloadItems(downloadFormat):
    try:
        download_items = gis.content.search(query="tags: = 'GDBNightlyBackup'",
                                            item_type='Feature Layer', max_items=-1)
        # Loop through each item; if it is a Feature Service, download it
        for item in download_items:
            if item.type == 'Feature Service':
                print(item)
                result = item.export(f'{item.title}', downloadFormat)
                result.download(path)  # folder where the download is stored
                # Delete the exported item from AGOL after downloading to save space
                result.delete()
    except Exception as e:
        print(e)

downloadItems(downloadFormat='File Geodatabase')

endTime = datetime.datetime.now()
td = endTime - startTime
print("\nDone!\n")
print("Ended at " + str(endTime))
print("Time elapsed " + str(td))

The result is that I have backup folders every day for all the hosted content I've been downloading.
Posted 05-06-2024 09:44 AM | Kudos 3 | Replies 0 | Views 1463
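If you want the elapsed time as a zero-padded hh:mm:ss string rather than the default str(td) format, divmod does the trick:

```python
import datetime

# Example elapsed time; in the script this would be endTime - startTime
td = datetime.timedelta(hours=1, minutes=5, seconds=30)

# td.seconds holds the sub-day remainder; split it into h/m/s
hours, remainder = divmod(td.seconds, 3600)
minutes, seconds = divmod(remainder, 60)
print('{:02}:{:02}:{:02}'.format(hours, minutes, seconds))  # 01:05:30
```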
POST
Awesome 🎉 Don't get too frustrated; it gets easier the more you use ArcPy. The more code you write, the more you can "steal" ideas from your other scripts, Esri samples, GIS Stack Exchange, etc. Even ChatGPT can sometimes produce usable Esri Python code!
Posted 04-25-2024 12:31 PM | Kudos 1 | Replies 0 | Views 389
POST
Python IDLE is probably putting a lock on those MXDs. If you close the Python results window, you should be able to edit in ArcMap.
Posted 04-25-2024 12:15 PM | Kudos 1 | Replies 0 | Views 1402
POST
|
@AlfredBaldenweck is correct. If you're ever curious what a variable is you can type: print(type(mxd)) Also, the df part need to be tabbed over: from arcpy import env
# Set the workspace for the folder containing MXDs
arcpy.env.workspace = r"S:\Workgroups\Test"
mxdList = arcpy.ListFiles("*.mxd")
# Iterate through all MXDs in the workspace
for mxd in mxdList:
mxdpath = env.workspace + "\\" + mxd
print (mxd + "Is being processed")
mxdNext = arcpy.mapping.MapDocument(mxdpath)
for df in arcpy.mapping.ListDataFrames(mxdNext,"Layers"):
for lyr in arcpy.mapping.ListLayers(mxd, "Backup", df):
if lyr == "Backup":
lyr.replaceDataSource(r"S:\Workgroups", "SHAPEFILE_WORKSPACE", "Backup.shp",
r"S:\Workgroups\Data.gdb\Final", "FILEGDB_WROKSPACE")
if lyr.name == "Backup":
lyr.name = "Final"
print("lyr.name")
mxdNext.save()
arcpy.RefreshTOC() Otherwise it's only going to run on the last MXD in the folder, not all the MXDs.
... View more
04-25-2024
12:06 PM
|
1
|
0
|
1409
|
POST
After making changes to your MXD via Python, you need to save those changes with mxd.save().
Posted 04-25-2024 11:42 AM | Kudos 3 | Replies 2 | Views 1433