I can find the feature class limits for a file geodatabase feature class here: ArcGIS Desktop. Does anyone know where to find the information/limits for an in_memory feature class? I'm currently looking for the maximum text field size, but I would also like a general overview of the limits of the in_memory feature class.
It appears as though the only limitation is your computer's memory. Interestingly, the in_memory feature class appears to ignore the defined length for text fields. In the script below I set a field length of 5, then successfully inserted a record with a one-million-character string. When I increased the string to one billion characters, I got a MemoryError.
import arcpy

# Create an in_memory polygon feature class (spatial reference WKID 32100)
testFC = arcpy.CreateFeatureclass_management(
    'in_memory', 'testFC', 'POLYGON', '', '', '', arcpy.SpatialReference(32100))

# Define a text field with a declared length of only 5 characters
arcpy.AddField_management(testFC, 'COMMENT', 'TEXT', '', '', 5)

# Insert a 5-character value, then a one-million-character value
ic = arcpy.da.InsertCursor(testFC, ['COMMENT'])
ic.insertRow(('test1',))
ic.insertRow(('t' * 1000000,))
del ic

# Both rows come back intact: the declared field length was ignored
with arcpy.da.SearchCursor(testFC, ['COMMENT']) as c:
    for row in c:
        print(len(row[0]))
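The memory behavior above tracks plain Python string allocation, so you can sketch it without arcpy at all. This is an illustrative standard-library-only check (not part of the original test), showing why a one-million-character string is trivial while a one-billion-character string can exhaust RAM:

```python
import sys

# A one-million-character str costs roughly 1 MB of RAM
one_million = 't' * 10**6
print(sys.getsizeof(one_million))  # roughly 1 MB, plus small object overhead

# A one-billion-character string needs ~1 GB in a single contiguous
# allocation, which is the likely source of the MemoryError above
estimated_gb = 10**9 / float(2**30)
print(round(estimated_gb, 2))
```

On a 32-bit ArcGIS Desktop Python process the usable address space is far below installed RAM, which makes the billion-character case fail even sooner.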
Do you mean the width? Probably the same as a gdb, but give it a whirl, as Bill Daigle did above for the record length.
My experience has been that the in_memory gdb is handled exactly like any other gdb. However, if you are concerned about size limits, you should probably stay on disk (file gdb). The amount of memory available depends on what else is loaded on your machine at that particular moment.
Because of this uncertainty, it's best to reserve in_memory for situations where you are sure the features or tables are fairly limited in size. If you load up the in_memory workspace with huge data, you risk crashing arcpy or, almost worse, locking up your machine by thrashing virtual memory. No one likes unstable script tools!
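One way to act on that advice is to gate the workspace choice on an upfront row-count estimate. The helper below is hypothetical (the `choose_workspace` name and the 500,000-row threshold are illustrative, not arcpy API); in practice the count could come from `arcpy.GetCount_management` and the fallback path from `arcpy.env.scratchGDB`:

```python
def choose_workspace(row_count, scratch_gdb, threshold=500000):
    """Return 'in_memory' for small datasets, a disk gdb path otherwise.

    row_count   -- estimated feature count (e.g. from GetCount_management)
    scratch_gdb -- fallback file geodatabase path (e.g. arcpy.env.scratchGDB)
    threshold   -- illustrative cutoff; tune for your RAM and feature size
    """
    if row_count <= threshold:
        return 'in_memory'
    return scratch_gdb

# Small table: keep it in RAM
print(choose_workspace(10000, r'C:\temp\scratch.gdb'))    # in_memory
# Big table: stay on disk to avoid thrashing
print(choose_workspace(2000000, r'C:\temp\scratch.gdb'))  # C:\temp\scratch.gdb
```

The threshold is deliberately a parameter rather than a constant, since the safe cutoff depends on geometry complexity and field widths, not just row count.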