Scripted Geodatabase Inventory

02-08-2015 07:18 PM
TobiasFimpel1
Occasional Contributor III

Someone must have coded this before, so why not ask here whether anyone can share successes or failures: I'd love to have a Python script that inventories an SDE geodatabase as the end user sees it in the ArcCatalog window (in other words, I'm not interested in all the tables that exist "under the hood", such as GDB_Items, etc.). The script would iterate over all objects, get the object name, attribute fields, domains, subtypes, etc., and write them out to a .csv file or similar. It should be technically doable. Has anyone attempted or accomplished this before? How far did you take it (i.e., did you capture object properties like privileges, editor tracking, archiving, etc.)?
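
To make it concrete, something along these lines is what I have in mind: a rough, untested sketch that walks feature datasets and feature classes and dumps field names, aliases, types and domains to a .csv. The connection file path is just a placeholder, and this assumes ArcGIS Desktop (Python 2):

import csv
import arcpy

# Placeholder .sde connection file -- point this at your own geodatabase
arcpy.env.workspace = r"C:\connections\mygdb.sde"

with open("gdb_inventory.csv", "wb") as f:  # "wb" for the csv module on Python 2
    writer = csv.writer(f)
    writer.writerow(["dataset", "feature_class", "field", "alias", "type", "domain"])

    # "" stands for feature classes that sit outside any feature dataset
    datasets = [""] + (arcpy.ListDatasets("", "Feature") or [])
    for ds in datasets:
        for fc in arcpy.ListFeatureClasses("", "", ds) or []:
            for fld in arcpy.ListFields(fc):
                writer.writerow([ds, fc, fld.name, fld.aliasName, fld.type, fld.domain])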

1 Reply
TeroRönkkö
Occasional Contributor

I made a script to fetch alias names, owners and domains into a big Excel sheet.

And you could do things like this:

import arcpy

arcpy.env.workspace = r"C:\connections\mygdb.sde"  # your .sde connection file

# Feature classes inside feature datasets:
datasets = arcpy.ListDatasets("", "Feature")

for dataset in datasets:
    # print('#' + dataset + ':')
    featureclasses = arcpy.ListFeatureClasses("", "", dataset)
    for fc in featureclasses:
        owner = fc.split('.')[0]      # SDE names are qualified as owner.name
        if owner == "EXAMPLE":
            print("DO SOMETHING")
            continue

# Go through non-dataset feature classes and tables:
standalone = []
featureclasses = arcpy.ListFeatureClasses()

for fc in featureclasses:
    standalone.append(fc)   # collect stand-alone feature class names

# etc.
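
For the alias names, owners and domains part, something roughly like this could work. This is a simplified, untested sketch (not the actual script), it prints instead of writing to Excel, and the connection file path is just a placeholder:

import arcpy

sde = r"C:\connections\mygdb.sde"  # placeholder connection file
arcpy.env.workspace = sde

# Geodatabase-level domains and their coded values
for domain in arcpy.da.ListDomains(sde):
    print(domain.name + " (" + domain.domainType + ")")
    if domain.domainType == "CodedValue":
        for code, value in domain.codedValues.items():
            print("  " + str(code) + " = " + value)

# Owner and alias name for every stand-alone feature class
for fc in arcpy.ListFeatureClasses():
    owner = fc.split(".")[0]               # SDE names are qualified as owner.name
    alias = arcpy.Describe(fc).aliasName
    print(fc + "  owner=" + owner + "  alias=" + alias)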