Is there an easier way to find out the size of data stored in ArcGIS Online without having to open each entry separately? We are trying to clean up some of our old entries that are eating up a lot of our credits, but opening over 300 different entries takes a long time, and clicking the back button doesn't help because you then have to figure out where you left off.
It would be nice if the size were listed in the contents view along with the name, type, shared, and date modified info.
I don't know of a GUI-based method for pulling the size information from multiple items, but you can do this with the ArcGIS API for Python. The script below writes a CSV listing every Hosted Service, its itemId, and the size of the item in bytes.
from arcgis.gis import GIS
import csv

# Connect to the portal as an administrator so all items are visible
gis = GIS('Portal URL', 'admin', 'password')

# Search for all hosted services (raise max_items above the default of 10)
hosted_layers = gis.content.search('typekeywords:"Hosted Service"', max_items=1000)
print('Hosted Service Count: ' + str(len(hosted_layers)))

# Write one row per item: itemId, title, and size in bytes
fields = ['id', 'title', 'size']
with open(r'file location', 'w', newline='') as outfile:
    csvfile = csv.writer(outfile)
    csvfile.writerow(fields)
    for layer in hosted_layers:
        row = [layer.id, layer.title, layer.size]
        csvfile.writerow(row)
print(outfile.name)
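If you just want to spot the largest items rather than produce a full CSV, you can also sort the same search results in memory. This is a minimal sketch, assuming the gis connection and hosted_layers list from the script above; size is reported in bytes, so it is converted to megabytes here only for readability.

# Sort the hosted services by size, largest first
# (assumes hosted_layers is already populated as above)
largest = sorted(hosted_layers, key=lambda item: item.size or 0, reverse=True)

# Print the 20 biggest items with their size in megabytes
for item in largest[:20]:
    size_mb = (item.size or 0) / (1024 * 1024)
    print('{:10.2f} MB  {}  {}'.format(size_mb, item.id, item.title))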
Hi Stephen - I believe you can get what you're looking for. Logged in as an administrator, click 'Organization' --> View Status --> scroll down to Aggregation by Type and select Storage --> choose a sub type --> a table appears with Size and Credits --> choose an item in the table --> View item details --> a Details window opens, which can be downloaded.
Adam Z
Check out this blog that outlines how to download the feature storage report:
Understanding Feature Storage Reports (December 2016) | ArcGIS Blog
-Kelly
This is exactly what I was looking for; thank you, Adam. It's the perfect way to look for bloated layers.