Does anyone know what record count is too big for the Mean Center tool to work, or any other way around these errors? My datasets are not going to get smaller in future work, so I would love to know how and where to fix this.
It keeps failing, and I'd rather not spend all day searching for the optimum size by trial and error. Right now all I know is that it's less than 700,000 records.
The original error message was: ValueError: Array is too big.
The new error message, after reducing the file size, is: MemoryError
I haven't tested this on your data, but have you tried NumPy? It works great with arrays.

import arcpy
import numpy as np

field_names = 'your field names'  # e.g. a list such as ['SHAPE@X', 'SHAPE@Y']
newarr = arcpy.da.FeatureClassToNumPyArray(originFC, field_names, null_value=-9999)

Use FeatureClassToNumPyArray to get the coordinates out into an array.
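To show what you would do with that array once you have it, here is a minimal sketch of the mean-center step itself. Since arcpy is not available outside ArcGIS, this fakes the structured array that FeatureClassToNumPyArray would return; the 'SHAPE@X'/'SHAPE@Y' field names and the -9999 null placeholder are assumptions carried over from the snippet above.

```python
import numpy as np

# Fake the structured array FeatureClassToNumPyArray would return,
# with point coordinates under the assumed 'SHAPE@X'/'SHAPE@Y' fields.
newarr = np.zeros(1_000_000, dtype=[('SHAPE@X', '<f8'), ('SHAPE@Y', '<f8')])
newarr['SHAPE@X'] = np.random.uniform(300000, 400000, newarr.size)
newarr['SHAPE@Y'] = np.random.uniform(300000, 400000, newarr.size)

# Mask out the -9999 null placeholder before averaging, then take the
# per-column means: that pair of means is the mean center.
valid = (newarr['SHAPE@X'] != -9999) & (newarr['SHAPE@Y'] != -9999)
mean_x = newarr['SHAPE@X'][valid].mean()
mean_y = newarr['SHAPE@Y'][valid].mean()
print(mean_x, mean_y)
```

In the real workflow you would replace the fake array with the `newarr` that comes back from arcpy and keep the rest unchanged.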
The mean of a million points can be computed in a matter of microseconds:
>>> import numpy as np
>>> a = np.random.randint(300000, 400000, 2000000).reshape(-1, 2)
>>> a.mean(axis=0)
array([ 349993.646074,  349992.998863])