Hi!
I am experiencing trouble when uploading a CSV to ArcGIS Enterprise Portal as a hosted table. When uploading the table I make sure to map the fields to the appropriate field type. My table contains text and numbers (in separate fields). I am mapping the fields containing numbers to the field type "double", but once the data is uploaded I have noticed that all fields have been changed to string.
I have not had this issue before. I have tried uploading new data as well as a copy of my previous data (which I could map successfully before), but now neither option works; I get the same result each time. Since Portal treats my numbers as text, I am having trouble using the data as intended.
Does anyone recognize this problem? I am trying to figure out whether the issue lies with my data or with Portal. My organization switched to Enterprise 11.1 in December 2023. Might that have something to do with it?
Thanks!
If a field contains a value in an unsupported or invalid format for the designated type, the field will be created as a string data type in the resulting hosted feature layer, even if you change the field type before publishing.
I would hazard a guess that there are values in your CSV that are not in the correct format, which is why the field gets auto-converted to string. You can check the cell alignment of your CSV in Excel (by default, numbers align right and text aligns left).
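If you'd rather not eyeball it in Excel, a quick pandas check can list exactly which rows in a supposedly numeric column fail to parse. This is a minimal sketch; the column name and sample data are hypothetical stand-ins for your CSV:

```python
import io
import pandas as pd

# Hypothetical sample standing in for the problem CSV: "count" should be
# numeric, but one row contains a stray non-numeric value.
csv_text = """name,count
alpha,10
beta,twelve
gamma,30
"""

df = pd.read_csv(io.StringIO(csv_text))  # in practice: pd.read_csv("your_file.csv")

# Coerce the column to numeric; any value that fails to parse becomes NaN.
parsed = pd.to_numeric(df["count"], errors="coerce")

# Rows where a value is present but parsing failed are the culprits.
bad_rows = df[parsed.isna() & df["count"].notna()]
print(bad_rows)
```

Any rows printed here are the ones that would force the whole field to string on upload.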
Source - I helped a lot of users with this issue and largely contributed to this article:
If you are extremely confident that it is NOT the CSV file (try testing on a smaller dataset) then we can consider other issues.
Thank you for a quick and informative answer!
I did not notice anything wrong with my table. However, I tried downloading it as an Excel file instead of a CSV, and when I uploaded the Excel file everything worked. I did not change the content of the data, only the format.
It seems the problem occurred because of the CSV format. I created all the table data with a Google Form, so it is possible the problem is connected to Google rather than to ArcGIS Enterprise Portal. This has all worked for me before, so I cannot say for sure why it did not work this time around.
There could be a few reasons. CSVs can get weird if there are unusual characters, particularly in the headers, or if the columns don't align with the values.
Personally, I'd recommend putting tables into a database prior to upload, i.e. importing into a file geodatabase with ArcGIS Pro. If you want to work with CSV/Excel and aren't confident in the source, just take a browse through the data. The checks and hints in the article outline some of the more common causes.
If this was helpful and there's no further questions please mark my previous answer as the solution to help others and give me sweet internet points.
I also recently came across the same issue @MicBry described. My .csv had over 100 records, and in one field the first 100 or so records were null, but the records after that contained integers. When I uploaded to Portal, it would auto-assign the field in question as "string", and overriding the field type did nothing.
After doing some testing, I think I was able to narrow my case down to this:
I get that it's a matter of performance, but I'd rather it take its time and ensure the data types are correct than process the file quickly with the chance they could be wrong.
We're on Enterprise 10.9.1, so I tried my test files on AGOL, and it behaved differently: AGOL would auto-assign the field to "string" regardless of the first 100 records (I didn't want to spend more time figuring out the magic number where auto-assigning would work on AGOL, but I figure Esri can). Overriding on AGOL does work, but it's not straightforward. When I tried to change the field type, the only option it offered was "String".
But if I set the field type display filter to "String"...
THEN it allows me to select "Integer"
Amazingly enough, the override actually worked!
@MicBry - any chance your data had fields where the first 100 records were null?
Anyway, the workaround I used was to populate the first row of any field where the first 100 rows were null with an appropriate value so that the field type auto-assigned correctly, then remove that value once the feature layer was created.
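That workaround can be scripted so it doesn't need to be done by hand in Excel. A minimal pandas sketch, assuming a hypothetical CSV where a numeric column starts with empty cells (the column names and sentinel value are made up for illustration):

```python
import io
import pandas as pd

# Hypothetical CSV where the "score" column is empty in its leading rows and
# only carries integers later, which can trip up type sniffing on upload.
csv_text = "id,score\n1,\n2,\n3,95\n"
df = pd.read_csv(io.StringIO(csv_text))  # in practice: pd.read_csv("your_file.csv")

placeholder = -9999  # sentinel value to delete after the layer is created

for col in df.columns:
    # Patch columns whose first value is null but that contain data further down.
    if pd.isna(df[col].iloc[0]) and df[col].notna().any():
        df.loc[0, col] = placeholder

# Upload the patched file, then remove the sentinel in the hosted layer.
df.to_csv("patched.csv", index=False)
```

The sentinel just guarantees the type sniffer sees a number in the first row; remember to delete it from the hosted feature layer afterwards.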