
Wrong field types in hosted table

01-19-2024 12:56 AM
MicBry
New Contributor II

Hi!

I am experiencing trouble when uploading a CSV to ArcGIS Enterprise Portal as a hosted table. When uploading the table I make sure to map the fields to the appropriate field type. My table contains text and numbers (in separate fields). I map the fields containing numbers to the field type "double", but once the data is uploaded I have noticed that all fields are changed to string.

I have not had this issue before. I have tried uploading new data as well as a copy of my previous data (which mapped successfully before), but now neither option works. I get the same result each time I try. Since Portal treats my numbers as text, I am having trouble using the data as intended.

Does anyone recognize this problem? I am trying to figure out whether the issue lies within my data or Portal. My organization switched to Enterprise 11.1 in December 2023. Might that have something to do with it?

 

Thanks!

4 Replies
ChristopherCounsell
MVP Regular Contributor

If a field contains values in an unsupported or invalid format for the designated type, the field will be created as the string data type in the resulting hosted feature layer, even if you change the field type before publishing.

I would hazard a guess that there are values in your CSV that are not in the correct format, and this is why the field gets auto-converted to string. You can check the cell alignment of your CSV in Excel (by default, numbers align right and text aligns left).
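Another quick check is to scan each supposedly numeric column for values that won't parse as numbers. A minimal sketch in Python using only the standard library (the file name and column names are placeholders to adapt to your own CSV):

```python
import csv

# Columns you expect to be numeric (placeholder names).
NUMERIC_COLUMNS = ["population", "area_km2"]

def is_number(value):
    """Return True if value parses as a float; empty strings count as null."""
    if value.strip() == "":
        return True  # nulls are fine on their own
    try:
        float(value)
        return True
    except ValueError:
        return False

# utf-8-sig strips a leading byte-order mark, a common CSV gotcha.
with open("data.csv", newline="", encoding="utf-8-sig") as f:
    reader = csv.DictReader(f)
    for row_num, row in enumerate(reader, start=2):  # row 1 is the header
        for col in NUMERIC_COLUMNS:
            if not is_number(row.get(col, "")):
                print(f"Row {row_num}, column '{col}': bad value {row.get(col)!r}")
```

Any value the script flags (stray text, thousands separators, locale decimal commas) is a candidate for forcing the whole field to string on upload.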

Source - I helped a lot of users with this issue and largely contributed to this article:

https://support.esri.com/en-us/knowledge-base/problem-generic-errors-from-uploading-and-published-cs...

If you are extremely confident that it is NOT the CSV file (try testing on a smaller dataset) then we can consider other issues.

MicBry
New Contributor II

Thank you for the quick and informative answer!

I did not notice anything wrong with my table. However, I tried downloading it as Excel instead of CSV, and when I uploaded this file (Excel), everything worked. I did not change the content of the data, only the format.

It seems the problem occurred because of the CSV format. I created all of the table data with a Google Form, so it is possible the problem is connected to Google rather than ArcGIS Enterprise Portal. This has all worked for me before, so I cannot say for sure why it did not work this time around.

 

ChristopherCounsell
MVP Regular Contributor

There could be a few reasons. CSVs can get weird if there are unusual characters, particularly in the headings, or if the columns don't align with the values.

Personally, I'd recommend putting tables into a database prior to upload, i.e. importing into a file geodatabase with ArcGIS Pro. If you want to work with CSV/Excel and aren't confident in the source, just browse through the data. The checks and hints in the article outline some of the more common causes.
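If you have ArcGIS Pro available, that import can be scripted. A rough sketch with arcpy, assuming placeholder paths and table names (on Pro 3.x the newer equivalent tool is arcpy.conversion.ExportTable):

```python
import arcpy

# Placeholder paths - adjust to your environment.
csv_path = r"C:\data\survey_results.csv"
gdb_path = r"C:\data\staging.gdb"

# Create the staging geodatabase if it doesn't exist yet.
if not arcpy.Exists(gdb_path):
    arcpy.management.CreateFileGDB(r"C:\data", "staging.gdb")

# Import the CSV; the geodatabase table gets explicit field types,
# which carry through when you publish the hosted table from Pro.
arcpy.conversion.TableToTable(csv_path, gdb_path, "survey_results")

# Inspect the field types the import assigned before publishing.
for field in arcpy.ListFields(f"{gdb_path}\\survey_results"):
    print(field.name, field.type)
```

The advantage of this route is that you can see (and fix) the assigned field types in the geodatabase before anything touches Portal.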

If this was helpful and there are no further questions, please mark my previous answer as the solution to help others and give me sweet internet points.

JCable
New Contributor II

I also recently came across the same issue @MicBry described. My .csv had over 100 records, and in one field the first 100 or so records were null, while the records after that contained integers. When I uploaded it to Portal, it would auto-assign the field in question as "string", and overriding the field type did nothing.

After doing some testing, I think I was able to narrow my case down to this (a sketch that reproduces test files like mine follows the list):

  • From what I can tell, for a given field, if there is at least one value (e.g. string, number, etc.) present in the first 100 records, it will auto-assign the field type based on the values of those first 100 records. If you open the attached file 'test_100.csv', you'll see that for records 1-99 the values in the field "Int_Field_AutoDetect_Test" are null and record #100 contains an integer. When you upload the file to Portal, it auto-assigns the field as "Integer", as expected.
  • However, if a field has only nulls in the first 100 records, it will auto-assign the field as "string", even if records after #100 are populated. If you use the attached file 'test_101.csv', you'll see that for records 1-100 the values in the field "Int_Field_AutoDetect_Test" are null and record #101 contains an integer. When you upload the file to Portal, it auto-assigns the field as "String".
  • Similarly, if the first 100 records contain only numeric values and anything after record #100 contains alphanumeric values, it will auto-assign the field as "Integer" and anything with letters/symbols ends up blank. If you use the attached file 'test_101b.csv', you'll see that for the first 100 records the values in the field "Control_Int_Field" are either integer or null, and record #101 contains a string. When you upload the file to Portal, it auto-assigns the field as "Integer" and record 101 has no value for "Control_Int_Field".
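For anyone who wants to reproduce this, here is a rough Python sketch that generates files like my attachments (the file and field names mirror the attachments above, but the script itself is illustrative, not the attached files):

```python
import csv

def make_test_csv(path, n_rows, field_name, values):
    """Write a CSV with an ID column plus one test field.

    `values` maps 1-based row numbers to the value for that row;
    every other row gets an empty cell (null).
    """
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["ObjectID", field_name])
        for i in range(1, n_rows + 1):
            writer.writerow([i, values.get(i, "")])

# Value in record #100: field auto-detects as Integer.
make_test_csv("test_100.csv", 100, "Int_Field_AutoDetect_Test", {100: 42})

# First value not until record #101: field auto-detects as String.
make_test_csv("test_101.csv", 101, "Int_Field_AutoDetect_Test", {101: 42})

# Integers in rows 1-100, a string at #101: auto-detects as Integer
# and the string value at #101 comes through blank.
make_test_csv("test_101b.csv", 101, "Control_Int_Field",
              {**{i: i for i in range(1, 101)}, 101: "oops"})
```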

I get that it's a matter of performance, but I'd prefer it take its time and ensure the data is correct rather than process it quickly with the chance it could be wrong.

We're on Enterprise 10.9.1, so I tried my test files on AGOL and it behaved differently. AGOL would auto-assign the field to "string" regardless of the first 100 records (I didn't want to spend more time trying to figure out the magic number where auto-assigning would work on AGOL, but I figure Esri can). Oh and overriding on AGOL works, but it's not straightforward. When I tried to change the field type, the only option it would give me was "String".

[Screenshot: JCable_0-1706035424777.png — field type dropdown offering only "String"]

 

But if I set the field type display filter to "String"...

[Screenshot: JCable_1-1706035459227.png — field type display filter set to "String"]

THEN it allows me to select "Integer"

[Screenshot: JCable_2-1706035510976.png — "Integer" now selectable in the field type dropdown]

Amazingly enough, the override actually worked!

 

@MicBry - any chance your data had fields where the first 100 records were null?

Anyway, a workaround I used was to populate the first row of any field where the first 100 rows were null with an appropriate placeholder value so the field auto-assigned correctly, then remove that value once the feature layer was created.
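A minimal sketch of that seed-value workaround with pandas (the file name, column list, and sentinel value below are placeholders):

```python
import pandas as pd

df = pd.read_csv("data.csv")  # placeholder file name

SENTINEL = -9999  # arbitrary seed value, removed again after publishing

# Seed the first row of any numeric-intended column whose first
# 100 rows are all null, so the type detection sees a number.
for col in ["late_values"]:  # placeholder column list
    if df[col].head(100).isna().all():
        df.loc[0, col] = SENTINEL

df.to_csv("data_seeded.csv", index=False)
# After publishing, clear the sentinel value in the hosted table
# so the data matches the original.
```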
