I'm trying to list all of the feature classes in a file geodatabase, all of which live inside feature datasets. The script below works on small datasets, but it slows way down and crashes once it reaches a dataset with 300-500+ feature classes, usually with error 999998. My RAM usage looks fine even when it hits the larger datasets. Has anyone run into this, or have any ideas on how to fix it?
import arcpy

sdepath = r"C:\Users\faulk\Documents\ArcGIS\Golf DB.gdb"
arcpy.env.workspace = sdepath
DataSetList = arcpy.ListDatasets("*")
for DS in DataSetList:
    fullPath = sdepath + "\\" + DS
    arcpy.env.workspace = fullPath
    FCList = arcpy.ListFeatureClasses("*")
    for FC in FCList:
        desc = arcpy.Describe(FC)
        numberOfAttributes = arcpy.GetCount_management(FC)
        results = fullPath + "\\" + "," + FC + "," + desc.shapeType + "," + desc.spatialReference.name + "," + str(numberOfAttributes)
        print results + "\n"
        del desc
        del numberOfAttributes
    del fullPath
    del FCList
    arcpy.ClearWorkspaceCache_management()
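As an aside, the dataset paths in the script are built by hand-concatenating backslashes; the standard library's os.path.join does the same thing without manual separators. A minimal sketch (the dataset name "Courses" here is hypothetical, just for illustration):

```python
import os

gdb = r"C:\Users\faulk\Documents\ArcGIS\Golf DB.gdb"
dataset = "Courses"  # hypothetical feature dataset name

# os.path.join inserts the platform's separator between the parts
full_path = os.path.join(gdb, dataset)
print(full_path)
```

This is a readability tweak, not a fix for the 999998 error, but it avoids subtle bugs like the stray separator in the results string above.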