
CSV-import: add "Row To Use As Field Names"

02-07-2023 02:45 AM
Status: Open
by nadja (Frequent Contributor)
Currently a CSV table is simply added to the Table of Contents (via drag & drop or "Add Data"). The first row is used for the field names if its content is formatted correctly (e.g. each value starts with a letter). However, if the data does not follow the Esri standards, for example because the header contains only numbers (different years), the header is treated as an ordinary data row, which is incorrect. When data comes from a third party, compliance with the Esri standards cannot always be expected, and if the table has many attributes, reformatting it can take a while. We therefore propose the following:

To keep the drag&drop option efficient, we propose to keep that tool without any import-wizard. --> as it is.

To add flexibility to the data import, we propose a new tool "CSV To Table", similar to the existing "Excel To Table" tool, which provides the option to specify a row to use as field names. Both "Excel To Table" and "CSV To Table" should additionally offer the option to create a temporary table, meaning that no "Output Table" has to be specified; the "Output Table" parameter would become optional.
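Until such a tool exists, the requested "row to use as field names" behaviour can be approximated outside the import itself. The following is only a minimal pandas sketch; the file paths, the header row index and the "Y" prefix are assumptions for illustration, not part of any ArcGIS tool:

import pandas as pd

src = r"C:\data\thirdparty.csv"   # hypothetical third-party CSV
header_row = 0                    # row to use as field names (0-based)

# Read the CSV, taking the chosen row as the header and keeping all
# values as text so nothing is silently converted.
df = pd.read_csv(src, header=header_row, dtype=str)

# Esri field names may not start with a digit, so prefix such headers
# (e.g. a year column "2021" becomes "Y2021").
df.columns = [f"Y{c}" if str(c)[:1].isdigit() else str(c) for c in df.columns]

# Write a cleaned copy that ArcGIS Pro will read with correct field names.
df.to_csv(r"C:\data\thirdparty_clean.csv", index=False)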

 

4 Comments
RoseF

I would use a CSV to Table conversion several times a week; as it is, I open the CSV in Excel and save it as an Excel file so I can use the Excel To Table tool.

Personally, I prefer the output table because I usually do a lot of work in the tables after conversion/creation. I don't trust a "temporary" table's integrity when I'm doing a lot of processing -- I don't even use CSVs in Esri products because I've had them fail in the past. So for me, making the output optional would work fine but would be unnecessary.

RandyCasey

I agree that using a CSV file "as-is" is a leap of faith in most circumstances, because it often fails to import correctly, especially when the table has values like Assessor Parcel Numbers that can and will start with leading zeros, only to have those values converted to numbers, which strips the leading zeros and effectively makes them useless. Having some control over how the table is read in would make seemingly simple tasks, like a quick relate to get a selection set, less frustrating; as it stands, you wind up having to convert the table somewhere else and import it, taking more time and effort than it really should.
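As an aside, the leading-zero problem can be sidestepped before the data ever reaches ArcGIS by forcing such columns to text. A minimal pandas sketch, where "APN" and the path are hypothetical:

import pandas as pd

# Force the parcel-number column to text so leading zeros are preserved.
df = pd.read_csv(r"C:\data\parcels.csv", dtype={"APN": str})
print(df["APN"].head())   # values such as "007123045" keep their zeros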

BarryNorthey

Have you explored using a schema.ini file? A quick test showed that the following lines inside schema.ini set the first row of the file numField.csv as the column name headers and accept field names starting with numbers. You can also force/override column formats, etc. The schema.ini file and the CSV file must reside in the same folder.

[numField.csv]
ColNameHeader=True

[screenshot: BarryNorthey_0-1675805353658.png -- result with schema.ini]

Otherwise:

[screenshots: BarryNorthey_1-1675805507126.png, BarryNorthey_2-1675805705638.png]

https://pro.arcgis.com/en/pro-app/latest/help/data/tables/add-an-ascii-or-text-file-table.htm
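For recurring imports, such a schema.ini can also be written from a script. A rough sketch, assuming one text column ("APN", kept as text so leading zeros survive) and one numeric year column ("Y2021"); the column names and width are made up for illustration:

from pathlib import Path

csv_path = Path(r"C:\data\numField.csv")   # hypothetical CSV location

# schema.ini must sit in the same folder as the CSV it describes.
ini_lines = [
    f"[{csv_path.name}]",
    "ColNameHeader=True",
    "Format=CSVDelimited",
    "Col1=APN Text Width 20",   # force text so leading zeros survive
    "Col2=Y2021 Double",        # assumed numeric column
]
(csv_path.parent / "schema.ini").write_text("\n".join(ini_lines) + "\n")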

 

RandyCasey

@BarryNorthey I am familiar with modifying the schema.ini file, and while that is a valid workaround that users should take note of for now, I don't think a workaround is what is being asked for here. At least for me, I would like to see ArcGIS Pro come up to the standard of text-file importing that other applications have offered for years, without making its users fiddle with or manipulate ancillary files just to have the data look and act the way they want.