Missing values after running Raster to ASCII

09-11-2017 09:04 AM
DataOfficer
Occasional Contributor III

I am trying to convert a number of raster datasets saved in a File Geodatabase to ASCII using the Raster to ASCII tool. I have tried this in both ArcGIS Pro 2.0 and ArcCatalog 10.5. 

 

For some reason, the resulting ASCII file has a different range of values from the input raster each time. In the example below, the input raster (seen on the left) has values between 6.70958 and 10.7875, while the resulting ASCII file (seen on the right) ranges from 6.70958 to 9.28548. As you can see, the ASCII file also looks quite different visually. I have applied the same symbology to both (Stretch, Percent Clip), so the symbology shouldn't be what's causing it.

 

What happened to those values? Is there another way of creating an ASCII file that I could try? 
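For reference, the scripted equivalent of what I'm running looks roughly like this (a minimal sketch; the geodatabase path and output folder below are placeholders, not my actual data):

import arcpy
import os

# Placeholder paths - substitute your own File Geodatabase and output folder
arcpy.env.workspace = r"C:\data\rasters.gdb"
out_folder = r"C:\data\ascii_out"

# Convert every raster in the geodatabase to an Esri ASCII grid
for ras in arcpy.ListRasters():
    out_ascii = os.path.join(out_folder, ras + ".asc")
    arcpy.RasterToASCII_conversion(ras, out_ascii)
    print("Converted {0} -> {1}".format(ras, out_ascii))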

6 Replies
curtvprice
MVP Esteemed Contributor

Is there any compression going on? Resampling based on the raster environment? The one on the right looks resampled (the edges are a different shape). Make sure the environment extent and cell size are set to the input before converting.
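Something like this sketch (the paths are placeholders) would pin the environment to the input before converting:

import arcpy

in_ras = r"C:\data\rasters.gdb\my_raster"        # placeholder input raster
out_ascii = r"C:\data\ascii_out\my_raster.asc"   # placeholder output file

desc = arcpy.Describe(in_ras)

# Match the geoprocessing environment to the input raster so no
# resampling or extent clipping happens during the conversion
arcpy.env.extent = desc.extent
arcpy.env.cellSize = desc.meanCellWidth
arcpy.env.snapRaster = in_ras
arcpy.env.outputCoordinateSystem = desc.spatialReference

arcpy.RasterToASCII_conversion(in_ras, out_ascii)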

DataOfficer
Occasional Contributor III

Thanks for the reply. Unfortunately, this has made no difference. The input raster is in fact a resampled file itself, as I converted what was originally WGS1984 to OSGB1936 and then resampled the cell size using the 'Resample' tool (Cubic). Could this be a problem?
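Roughly, the prep steps were equivalent to this sketch (the paths, the 10 m cell size, and the omitted datum transformation are placeholders/simplifications, not my exact settings):

import arcpy

in_ras = r"C:\data\rasters.gdb\raster_wgs84"        # placeholder input (WGS 1984)
projected = r"C:\data\rasters.gdb\raster_osgb"      # placeholder projected output
resampled = r"C:\data\rasters.gdb\raster_osgb_10m"  # placeholder resampled output

# Project from WGS 1984 to British National Grid (OSGB 1936), EPSG 27700
# (a geographic/datum transformation argument may also be required here)
arcpy.ProjectRaster_management(in_ras, projected,
                               arcpy.SpatialReference(27700),
                               "CUBIC")

# Resample to the target cell size with cubic convolution
arcpy.Resample_management(projected, resampled, "10", "CUBIC")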

DataOfficer
Occasional Contributor III

Update: The underlying files are actually identical and display correctly when opened in Desktop 10.5. This problem only occurs in Pro, which is incredibly frustrating as I'm deliberately trying to move away from Desktop, but Esri are making it hard to do so.

DanPatterson_Retired
MVP Emeritus

How are you assessing the difference? Is it a visual/symbology assessment, or one based on the actual values?

There is nothing in the Raster to ASCII tool that would account for a change in values. Is there something in the raster environments that might point to the issue? The latter would depend on the raster type you are saving to.

You could always use numpy (example here) to convert the ASCII files back to rasters, which won't change the input values at all, just to confirm what you have.
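As a rough sketch of that idea (the file path is a placeholder, and it assumes the standard six-line header including NODATA_value), numpy can read the ASCII grid directly and report its actual value range, independent of any symbology:

import numpy as np

asc_file = r"C:\data\ascii_out\my_raster.asc"   # placeholder path

# Read the six-line Esri ASCII grid header
header = {}
with open(asc_file) as f:
    for _ in range(6):
        key, val = f.readline().split()
        header[key.lower()] = float(val)

# Load the cell values, masking out the NODATA cells
arr = np.loadtxt(asc_file, skiprows=6)
arr = np.ma.masked_equal(arr, header["nodata_value"])

print("min {0}, max {1}, mean {2}, std {3}".format(
    arr.min(), arr.max(), arr.mean(), arr.std()))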

DataOfficer
Occasional Contributor III

I'm assessing the difference by looking at the raster statistics under 'Properties', which show that the minimum, maximum, standard deviation, etc. are the same. I think the difference in visual display is a bug in Pro 2.0, as this problem does not occur in Desktop 10.5.
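For completeness, the same statistics can be pulled programmatically rather than through the Properties dialog (a sketch; the paths are placeholders):

import arcpy

for ras in [r"C:\data\rasters.gdb\my_raster",       # placeholder: original raster
            r"C:\data\ascii_out\my_raster.asc"]:    # placeholder: converted ASCII file
    # Make sure statistics are up to date before reading them
    arcpy.CalculateStatistics_management(ras)
    stats = [arcpy.GetRasterProperties_management(ras, p).getOutput(0)
             for p in ("MINIMUM", "MAXIMUM", "MEAN", "STD")]
    print("{0}: {1}".format(ras, stats))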

DanPatterson_Retired
MVP Emeritus

It would be worthwhile bundling up the data for testing since you have a test case. Tech support can assess whether it is a bug or a variation in the classification settings that isn't readily apparent.

At least you have ruled out changes in the base values... and there is no interpolation occurring that could alter the values and hence the ranges.