I've been tasked with providing a CSV of all my file geodatabases and their file sizes. Suggestions, please?
Are you only looking at file geodatabases? Are they all in the same folder? Please describe your situation in more detail.
In this situation, arcpy.da.Walk doesn't gain you anything over os.walk, because arcpy.da.Walk is focused on the contents of geospatial data stores (feature classes, raster catalogs, tables, etc.) rather than the data stores themselves (file geodatabases, folders, enterprise geodatabases, etc.). arcpy.da.Walk can still do the job, it just does it more slowly because of the extra overhead of enumerating the geospatial data inside each data store.
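For reference, here is a minimal sketch of what the arcpy.da.Walk route might look like, just to show why it isn't buying you anything here; the root path is a placeholder, and note that arcpy still enumerates the datasets inside each workspace as it walks, which is where the extra overhead comes from:

import os
import arcpy

root = r"C:\data"  # placeholder: parent directory to start the recursive search

# File geodatabases show up as workspace names in dirnames, so the test is
# the same .gdb check you would do with os.walk.
for dirpath, dirnames, filenames in arcpy.da.Walk(root):
    for dirname in dirnames:
        if dirname.endswith(".gdb"):
            print(os.path.join(dirpath, dirname))

You would still have to drop down to os.path.getsize to total up the size of each geodatabase folder, which is why plain os.walk is the simpler tool for this task.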
Since you seem interested primarily in the size of each file geodatabase rather than the specifics of what is inside it, I recommend using plain os.walk to find the file geodatabases and total up their sizes.
import os

def get_size(start_path='.'):
    """Return the total size, in bytes, of all files under start_path."""
    total_size = 0
    for dirpath, dirnames, filenames in os.walk(start_path):
        for f in filenames:
            fp = os.path.join(dirpath, f)
            total_size += os.path.getsize(fp)
    return total_size

output = r"C:\temp\gdb_sizes.csv"  # placeholder: output CSV file
path = r"C:\data"                  # placeholder: root/parent directory to start recursive search

with open(output, 'w') as f:
    for dirpath, dirnames, filenames in os.walk(path):
        for dirname in dirnames:
            # A file geodatabase is just a folder whose name ends in .gdb
            if dirname.endswith('.gdb'):
                gdb = os.path.join(dirpath, dirname)
                size = get_size(gdb)
                f.write("{},{}\n".format(gdb, size))
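If you want a header row, or if any of your paths might contain commas, you could let the standard csv module do the writing instead of formatting the strings by hand. A rough variant of the loop above, assuming Python 3 and reusing the get_size function (the paths are placeholders):

import csv
import os

output = r"C:\temp\gdb_sizes.csv"  # placeholder: output CSV file
path = r"C:\data"                  # placeholder: root directory to search

with open(output, 'w', newline='') as f:
    writer = csv.writer(f)
    writer.writerow(["geodatabase", "size_bytes"])  # header row
    for dirpath, dirnames, filenames in os.walk(path):
        for dirname in dirnames:
            if dirname.endswith('.gdb'):
                gdb = os.path.join(dirpath, dirname)
                writer.writerow([gdb, get_size(gdb)])  # quoting handled by csv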
The get_size function came from the accepted answer by monkut to the following Stack Overflow post: Calculating a directory size using Python?
Super helpful, thank you VERY much!
No problem. Whether my response or someone else's, please close out your questions by marking a "Correct" answer when you no longer have an issue.