Sorry for the late response - my email filter sent the reply notices to my spam folder. Go figure.

There are a substantial number of records in some tables...3-4 million in a few cases. A test with 2,130,000 records took 19.07 seconds using the following code:

    import arcpy
    from stopwatch import clockit

    @clockit
    def maketv_count(table):
        # Build a table view filtered on the where clause, then count it
        arcpy.MakeTableView_management(table, "mytv5k", "NEAR_DIST < 1000")
        arcpy.GetCount_management("mytv5k")
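For reference, the `stopwatch` module isn't shown in this thread; a minimal sketch of what a `clockit`-style timing decorator could look like (names and output format are assumptions, not the actual module):

    import time
    from functools import wraps

    def clockit(func):
        """Print how long the wrapped function takes to run."""
        @wraps(func)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            result = func(*args, **kwargs)
            elapsed = time.perf_counter() - start
            print("%s took %.2f seconds" % (func.__name__, elapsed))
            return result
        return wrapper

    @clockit
    def count_to(n):
        # Trivial stand-in for the table-counting functions above
        total = 0
        for _ in range(n):
            total += 1
        return total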
Using a search cursor and iterating through the rows took 19.43 seconds with this code:

    @clockit
    def maketv_cursor(table):
        # Count matching rows by iterating a filtered search cursor
        rows = arcpy.SearchCursor(table, "NEAR_DIST < 1000")
        count = 0
        for row in rows:
            count += 1
Granted, 2 million records is a LOT...but I didn't expect the two methods to be nearly identical in speed.

Thanks,
mike