Hi,
We are experiencing an issue with odd CSV/Excel files when downloading a table from the Dashboard.
Please see the attachment "decimal issue".
I have also attached how it looks in our SQL Server database (2019): "sql server overview", "sql server design".
Workflow:
- Used ArcGIS Pro to get the feature class from our SDE.
- Published it as a feature service. Checked the data type at the service level and confirmed it is Double.
- Added the service to the map consumed by the Dashboard.
- Tried to download the table.
We are using Enterprise 11.2.
Tried changing the data type from numeric to nvarchar in the database. The download then matches the table in the database, with the correct number of decimals. However, filtering on numeric values no longer works once the field is a string.
Tried publishing a new service with another feature class. The result is the same: too many decimals.
Has anyone experienced a similar issue?
Could it be related to regional settings somehow, with "," and "." being interpreted as different separators?
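As a quick illustration of the separator ambiguity I mean (plain Python, with a made-up value, not the actual export):

# The same text can be read two ways depending on which convention is assumed:
# "1,234" is one thousand two hundred thirty-four with a thousands comma,
# but 1.234 if the comma is a decimal separator.
raw = "1,234"
print(float(raw.replace(",", "")))   # 1234.0 -> comma treated as thousands separator
print(float(raw.replace(",", ".")))  # 1.234  -> comma treated as decimal separator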
Are the extra decimals in the CSV itself? Excel applies some fairly unpredictable formatting; I would check the display settings of those cells.
Yes, the extra decimals are in the CSV file as well. The CSV actually contains more decimals than the feature class/table in the database, which is strange.
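My suspicion is that the export writes the raw double at full precision. A tiny Python sketch showing the same effect (the value 1.234 is just a placeholder, not from our data):

# A decimal like 1.234 has no exact binary double representation, so printing
# it at full precision exposes extra digits that are normally hidden.
x = 1.234                   # hypothetical value stored with 3 decimals
print(f"{x:.20f}")          # prints something like 1.23399999999999998579
print(0.1 + 0.2)            # 0.30000000000000004 -- the classic example
print(round(0.1 + 0.2, 3))  # 0.3 -- rounding before writing hides the artifact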
Update: I also tried using FME to round the values to a maximum of 3 decimals and set the writer to 3 decimals, but the downloaded CSV still shows between 10 and 15 digits.
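For anyone hitting the same thing, a possible post-download cleanup is to round the exported CSV afterwards. A rough sketch with pandas (the file and column names are placeholders, not the actual export):

import pandas as pd

# Round the affected column back to 3 decimals and write it out
# with a fixed 3-decimal format.
df = pd.read_csv("dashboard_export.csv")           # hypothetical export file name
df["measurement"] = df["measurement"].round(3)     # placeholder column name
df.to_csv("dashboard_export_rounded.csv", index=False, float_format="%.3f")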