ArcGIS Pro/ArcMap .csv imports 8000-character String fields

09-13-2018 08:49 AM
JasonMenard1
New Contributor

Has anyone ever had issues importing a .csv as events into ArcMap (or Pro) and having it interpret every String field with a length of 8000? Exporting these events creates otherwise viable feature classes that inherit the same field lengths, which break most geoprocessing tools. I'd love to know how to avoid this.


9 Replies
DanPatterson_Retired
MVP Emeritus

Sample of a few rows?

Sounds like you may be missing newline delimiters.

AdrianWelsh
MVP Honored Contributor

Jason,

I've had this happen a few times but have not found any issues with geoprocessing tools. Which tools are not working for you?

TimOrmsby
Esri Regular Contributor

Hi Jason, this bug has been in the system a while. It's marked as medium severity. If it's breaking geoprocessing tools, please contact tech support and reference bug number BUG-000086039 so the severity can be upgraded.

DanPatterson_Retired
MVP Emeritus

That bug has been around for a while. I haven't experienced it with csv files in ArcGIS Pro. I have created them with text editors and as output from NumPy and other programs.

How was your csv file created? From Excel or some other program?

AdrianWelsh
MVP Honored Contributor

Dan, I get the 8000 characters thing from CSVs and my CSVs are created from Excel. So, maybe Excel is the culprit here...

DanPatterson_Retired
MVP Emeritus

Excel seems to be the culprit. I use Excel or Quattro Pro to produce csvs all the time, but they go into NumPy structured arrays and I usually set the dtype myself. I'll be paying more attention to np.genfromtxt to see the dtypes produced.

There is a related issue with file gdbs and TableToNumPyArray, so whatever is going on needs to be monitored if you're working in that environment. PS: a schema is like setting a dtype for a structured/recarray.
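Setting the dtype yourself when reading a csv with np.genfromtxt, as described above, can be sketched like this (the csv contents and field names here are made up for illustration):

```python
import numpy as np
from io import StringIO

# Hypothetical csv contents; in practice this would be a file path
csv_text = StringIO("id,name,value\n1,alpha,3.5\n2,beta,4.0")

# Explicit dtype: 32-bit int, 10-character string, 64-bit float
dt = np.dtype([("id", "i4"), ("name", "U10"), ("value", "f8")])

# skip_header=1 skips the column-name row, since the dtype supplies names
arr = np.genfromtxt(csv_text, delimiter=",", dtype=dt, skip_header=1)

print(arr["name"])  # string fields come back with the width we chose
```

Because the dtype is explicit, the string field is exactly 10 characters wide rather than whatever width the reader would have guessed.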

KoryKramer
Esri Community Moderator

I haven't tried this, but you could create a schema.ini file that defines the exact width you want for each field in the csv you're importing, and make sure that schema.ini is in the same folder as the .csv.

Here's some doc: Adding an ASCII or text file table—Help | ArcGIS Desktop 

Here's a quick example I had sitting around in a folder:
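As a rough sketch of what that could look like (the csv name and field definitions below are hypothetical), a schema.ini in the same folder might contain:

```ini
[mydata.csv]
Format=CSVDelimited
ColNameHeader=True
Col1=SiteID Long
Col2=SiteName Text Width 50
Col3=Latitude Double
Col4=Longitude Double
```

Each ColN line names a field and gives its type; the Width keyword on Text fields is what should keep the string length from defaulting to 8000.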

Hope this helps!

AdrianWelsh
MVP Honored Contributor

Kory, that sounds like an awesome workaround. If it works well, it would be good to add that to the bug information.

KarinBodtker
New Contributor

It works! I tried it today. Just make sure the schema.ini file is in the same folder as the csv or txt file you are importing/adding.