arcpy.ListFeatureClasses() Returning 'NoneType'

08-04-2020 12:02 PM
LeviCecil
Occasional Contributor III

I have a geodatabase with thousands of grid features. I'm trying to dissolve and append these to a single dataset. I've tried iterating through the feature classes, but after spinning for a while the script keeps returning AttributeError: 'NoneType' object has no attribute 'group' when I try to list the feature classes. I have another nearly identical gdb that does not have this issue. Is there a way to figure out if there is a corrupted feature class, or if the gdb is corrupted? 
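For context, that exact AttributeError usually comes from Python's re module rather than from the geodatabase itself: re.search returns None when a pattern does not match, and calling .group() on that None fails with this message. A minimal reproduction (the pattern and feature class names here are illustrative):

```python
import re

# re.search returns a Match object on success, None on failure.
m = re.search(r"(?<=_room_).*?(?=_grid)", "SITE_room_12_grid")
print(m.group(0))  # -> 12

# A name that doesn't follow the expected convention yields None,
# and None.group(...) raises the AttributeError quoted above.
bad = re.search(r"(?<=_room_).*?(?=_grid)", "SITE_gridindex")
try:
    bad.group(0)
except AttributeError as e:
    print(e)  # 'NoneType' object has no attribute 'group'
```

So a single feature class whose name breaks the expected pattern can produce this error without anything being corrupted.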

7 Replies
DanPatterson
MVP Esteemed Contributor

grid features are rasters, 

perhaps

ListRasters—ArcGIS Pro | Documentation 


... sort of retired...
LeviCecil
Occasional Contributor III

Sorry, I didn't mention that these are polygon grids created with the Generate Tessellation tool.

DanPatterson
MVP Esteemed Contributor

Assuming:

locally stored data, nothing is being edited, you have rebooted and the observation persists

Then try:

That leaves the feature class names and the geodatabase name.

You can also try using FeatureClassToFeatureClass to copy one of the fcs to a different gdb under the same name, to see whether the problem is the fc or the original gdb.
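That isolation idea can be sketched as a loop that attempts the copy for each feature class and records which ones fail. This is a generic sketch, not arcpy code: copy_fn here is a stand-in for the actual FeatureClassToFeatureClass call, and the names are hypothetical.

```python
def find_failing_items(names, copy_fn):
    """Attempt copy_fn on each name; return (ok, failed) lists."""
    ok, failed = [], []
    for name in names:
        try:
            copy_fn(name)
            ok.append(name)
        except Exception as exc:  # arcpy tools raise arcpy.ExecuteError in practice
            failed.append((name, str(exc)))
    return ok, failed

# Illustrative stand-in: pretend one feature class is unreadable.
def fake_copy(name):
    if name == "broken_fc":
        raise RuntimeError("cannot read feature class")

ok, failed = find_failing_items(["fc_a", "broken_fc", "fc_b"], fake_copy)
print(ok)      # ['fc_a', 'fc_b']
print(failed)  # [('broken_fc', 'cannot read feature class')]
```

If every copy succeeds, the individual feature classes are probably fine and suspicion shifts to the source gdb.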


... sort of retired...
LeviCecil
Occasional Contributor III

I was able to copy over several dozen without an issue. I tried compacting and compressing the gdb, but that didn't help. I'll try rebooting. The gdb is named "Default", because I was using the project default for temporary storage. 

JoshuaBixby
MVP Esteemed Contributor

It would be helpful if you showed how you are setting up your workspace and calling ListFeatureClasses. Are the feature classes in a feature dataset or at the root level of the GDB?

LeviCecil
Occasional Contributor III

They're at the root level. So it's working now. I changed the name of the Default gdb to "grids_org." Not sure if that was the issue. Here is my script:

import os, datetime, re
import sys
import time
import arcpy
from arcpy import env
from datetime import date, timedelta

date_obj = date.today()
date_string = date_obj.strftime("%Y%m%d")

# In-memory workspace prefix (a raw string cannot end in a backslash)
memory_hole = 'memory\\'

output_gdb = r"C:\Temp\ReEntry_Capacity\merged_grids.gdb"
input_gdb = r"C:\Temp\ReEntry_Capacity\grids_org.gdb"

arcpy.env.workspace = input_gdb
arcpy.env.overwriteOutput = True
featureclasses = arcpy.ListFeatureClasses()

for fc in featureclasses:

    start_time = time.time()

    # GetCount returns a Result object; convert to int before using it
    capacity = int(arcpy.GetCount_management(fc).getOutput(0))
    room = re.search(r"(?<=_room_).*?(?=_grid)", fc)
    site = re.search(r".+?(?=_room_)", fc)

    # Guard against names that don't follow the <site>_room_<n>_grid
    # convention; otherwise .group() raises AttributeError on None
    if room is None or site is None:
        print("Skipping (unexpected name): " + fc)
        continue

    roomNum = room.group(0)
    siteName = site.group(0)
    grid_buffer_dissolve = memory_hole + siteName + "_room_" + roomNum + "_buffdiss"

    print(siteName + " room " + roomNum)

    arcpy.analysis.Buffer(fc, grid_buffer_dissolve, "-0.01 Feet", "", "", "ALL", None, "PLANAR")

    arcpy.AddField_management(grid_buffer_dissolve, "Site", "TEXT")
    arcpy.CalculateField_management(grid_buffer_dissolve, "Site", '"' + siteName + '"')
    arcpy.AddField_management(grid_buffer_dissolve, "RoomNumber", "TEXT")
    arcpy.CalculateField_management(grid_buffer_dissolve, "RoomNumber", '"' + roomNum + '"')
    arcpy.AddField_management(grid_buffer_dissolve, "Count", "SHORT")
    arcpy.CalculateField_management(grid_buffer_dissolve, "Count", capacity)

    arcpy.Append_management(grid_buffer_dissolve, output_gdb + '\\capacity_grids', "NO_TEST")

    elapsed_time_secs = time.time() - start_time

    msg = "Execution time: %s" % timedelta(seconds=round(elapsed_time_secs))
    print(msg)
LeviCecil
Occasional Contributor III

The problem I'm having now is that the script is skipping over the first couple of dozen feature classes and I'm not sure why. I'm able to load these features into a map with no problems. There is no difference in the naming conventions. The features being skipped are like "ABERNETHY_room_1_grid", and the first ones the script iterates through are like "AINSWORTH_room_1_grid." Some of the ones being skipped were copied from another geodatabase, but there is nothing outwardly wrong with them. Is there a method to examine features in a geodatabase for errors?
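One quick thing to rule out is whether the skipped names actually match the two re.search patterns the script relies on: any name that fails either pattern would be dropped (or crash) silently. A small audit of a name list against the expected `<site>_room_<n>_grid` convention (the third name below is a hypothetical non-matching one):

```python
import re

ROOM = re.compile(r"(?<=_room_).*?(?=_grid)")
SITE = re.compile(r".+?(?=_room_)")

def audit_names(names):
    """Return the names for which room/site re.search would return None."""
    return [n for n in names if ROOM.search(n) is None or SITE.search(n) is None]

names = ["ABERNETHY_room_1_grid", "AINSWORTH_room_1_grid", "AINSWORTH_grid"]
print(audit_names(names))  # -> ['AINSWORTH_grid']
```

If every skipped name passes this audit, the regex is not the cause and the skipping happens elsewhere (for example in what ListFeatureClasses actually returns).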
