Max. text length for InsertCursor field

07-17-2017 12:52 AM
MarcWouters
Emerging Contributor

I am importing X,Y values from a file into a shape, but I also need to keep the label of each point in my application.

Lacking a way to store a list of texts, I create one long, tab-separated string, so that the labels and the shape stay in the same feature of the resulting table.

I use the following code (the with-statement makes sure the cursor is released):

with arcpy.da.InsertCursor("Pipes", ["Name", "SHAPE@", "PointLabels"]) as cursor:
    cursor.insertRow([fileName, polyline, pointLabels])
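For reference, the label string in that snippet could be packed and unpacked like this (a minimal sketch; `labels` is a hypothetical list of point labels):

```python
# Pack a list of point labels into one tab-separated string
# (hypothetical data; this would become pointLabels above).
labels = ["valve_01", "joint_02", "hydrant_03"]
pointLabels = "\t".join(labels)

# Unpack it again on the reading side.
restored = pointLabels.split("\t")
assert restored == labels
```

Note that this round trip only works if the individual labels themselves never contain a tab character.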

For some datasets, I get this error

The row contains a bad value. [PointLabels]

I assume that this is caused by exceeding some size limit, but I cannot easily determine which limit it might be (the number of points and the length of the text labels vary quite a lot).

Any idea whether there is a limit, and if so, what it is?

16 Replies
MarcWouters
Emerging Contributor

All data MUST be stored in polygons: we can have several thousand polygons with several thousand vertices each, and visualisation would simply be much too slow if we used one table entry per vertex (a points table).

Your proposal of a separate point feature class (in effect duplicating all the data) might work if we split the visualisation (polygons) from the administration part (points). But it requires a lot of extra linking, I guess.
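A rough sketch of the kind of linking that would be needed, assuming each polygon and its points share a common key (all names and values here are invented):

```python
# Hypothetical link between a polygon feature class and a point
# feature class via a shared "PipeName" key.
polygons = {"pipe_A": "polyline_geometry_A"}  # PipeName -> SHAPE

points = [
    {"PipeName": "pipe_A", "Label": "valve_01"},
    {"PipeName": "pipe_A", "Label": "joint_02"},
]

# Collect the labels belonging to each polygon.
labels_by_pipe = {}
for pt in points:
    labels_by_pipe.setdefault(pt["PipeName"], []).append(pt["Label"])
```

In a geodatabase this join could be formalised as a relationship class instead of being rebuilt in code each time.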

JoshuaBixby
MVP Esteemed Contributor

Ignoring the data design issues for the time being (though, for the record, I agree with the other commenters on them), what is the current field length of the text field you are trying to insert into?  I just inserted a 400,000-character string into a file geodatabase text field that was set to a length of 500,000, but I get a bad value error if I try to insert a 550,000-character string.  In short, make sure your field is specified to handle strings as long as the ones you are trying to insert.
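In arcpy, the defined length of a text field can be read from `arcpy.ListFields(table)` (each returned `Field` object has a `length` attribute). A minimal, pure-Python sketch of the pre-insert check, with made-up lengths:

```python
def check_fits(value, field_length):
    """Return True if a string fits the field's defined length.

    In practice field_length would come from the geodatabase,
    e.g. via arcpy.ListFields(table) and field.length.
    """
    return len(value) <= field_length

# Mirrors the experiment above: a 400,000-character string fits a
# 500,000-length field, a 550,000-character string does not.
assert check_fits("x" * 400_000, 500_000)
assert not check_fits("x" * 550_000, 500_000)
```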

MarcWouters
Emerging Contributor

Joshua,

I had just discovered that this was the cause of my original problem.

Thanks for answering the first question!

MarcWouters
Emerging Contributor

At this moment I'm not going to change the design of the tables, because the C++ application already uses this format, and changing everything would take too much time.

So I set my text length to 400k to be on the safe side. But many of these text fields are much shorter, which is a huge waste of memory space. Is there a way to set this length dynamically, so it can store short AND long text strings?
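A file geodatabase text field has one fixed declared length, so it cannot grow per row; but if all label strings are known before the field is created, the length can be sized to the actual data instead of a blanket 400k. A sketch (the 10% headroom is an arbitrary choice, and the AddField call in the comment is the standard arcpy tool):

```python
def required_field_length(strings, headroom=1.1):
    """Smallest field length that fits every string, plus headroom."""
    longest = max((len(s) for s in strings), default=0)
    return int(longest * headroom) + 1

labels = ["a" * 120, "b" * 950, "c" * 40]  # hypothetical label strings
length = required_field_length(labels)     # 1046 for this data

# The result could then be passed on when creating the field, e.g.:
# arcpy.management.AddField(table, "PointLabels", "TEXT",
#                           field_length=length)
```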

XanderBakker
Esri Esteemed Contributor

I guess that if you don't want to change the design of the tables and adapt the code to work with relationship classes, you will have to accept that there will be some memory waste...

Edit: CC vangelo-esristaff, who I'm sure can provide some useful insights...

JoshuaBixby
MVP Esteemed Contributor

The file geodatabase text field behaves more like a varchar than a char, so you don't really waste disk space whether the max length is set to 100 or 500,000.  It is the contents of the field that drive most of the space on disk.  That said, there may be performance issues tied to storing lots of short strings in a field with a max length of 500,000; I don't know the FGDB internals well enough to say for sure.

DanPatterson_Retired
MVP Emeritus

You can't even modify field length in ArcGIS Pro... unless the table is empty: Modify Fields
