Greetings,
I am trying to use Map Algebra in the Raster Calculator in ArcGIS 10.5 to compute the mean of a set of raster (.tif) layers. The raster layers have an 8-bit radiometric resolution, so each layer has 256 possible 'shades' of data. However, after the computation, my new composite raster only has 252-254 possible data values. I kept the default parameters in the Environment settings, so I am wondering whether I missed something in the Environment, or whether this is a glitch in ArcMap 10.5.
Thank you,
Brent
Have you tried Cell Statistics?
And with regard to the Raster Calculator issue: do all the rasters have the same spatial reference, cell size, and origin?
Also, for example, (255 + 255 + 201) / 3 = 237, which means the mean can produce values that none of the inputs had, and it need not cover all 256 levels. Maybe I'm not understanding what you are saying.
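To make that concrete, here is a small NumPy sketch (illustrative only, not ArcGIS itself): the per-cell mean of several full-range 8-bit rasters almost never lands on the extreme levels, because, e.g., the mean is 0 only when every input is 0 at that cell. So the output typically has slightly fewer distinct values than the inputs, much like the 252-254 you are seeing:

```python
import numpy as np

# Three synthetic full-range 8-bit rasters (stand-ins for the .tif layers).
rng = np.random.default_rng(0)
r1, r2, r3 = (rng.integers(0, 256, size=(100, 100), dtype=np.int32)
              for _ in range(3))

# Truncated integer mean, the way (r1 + r2 + r3) / 3 behaves when the
# inputs are integer rasters.
mean_int = (r1 + r2 + r3) // 3

print("distinct input levels:", np.unique(r1).size)
print("distinct output levels:", np.unique(mean_int).size)
```

With enough cells the inputs use all 256 levels, while the mean misses the extremes and comes out with fewer distinct values.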
Hi Steve,
Yes, all of the rasters have the same spatial reference, 1 km cell size, and origin, as they were retrieved from the same dataset. Sorry, I should have mentioned that some cells in the input rasters contain missing data (NoData), so could that be affecting the computation?
Cell Statistics was well-suited for this computation, but I am still curious as to why there is data loss with Map Algebra.
Thank you!
Brent
Brent,
Time for some debugging 🙂
Subtract the Raster Calculator output from the Cell Statistics output and find a pixel where the difference is not zero. Then look up the value at that location in each of your input rasters and calculate the mean by hand. What do you see?
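If the missing cells you mentioned are the culprit, here is a NumPy mock-up of that comparison (hypothetical toy values, not arcpy), assuming the two tools differ in NoData handling: a Raster Calculator expression like (r1 + r2 + r3) / 3 returns NoData wherever any input is NoData, while Cell Statistics MEAN ignores NoData by default and averages the values that are present:

```python
import numpy as np

# Hypothetical toy rasters standing in for the inputs; -1 marks NoData.
NODATA = -1
r1 = np.array([[10, 20], [30, 40]])
r2 = np.array([[12, NODATA], [30, 44]])
r3 = np.array([[14, 22], [NODATA, 48]])

stack = np.ma.masked_equal(np.stack([r1, r2, r3]), NODATA)

# Cell Statistics MEAN with the default "ignore NoData" behaviour:
# average whatever values are present at each cell.
cellstats = stack.mean(axis=0)

# Raster Calculator-style mean: any NoData input makes the cell NoData.
calc = np.ma.masked_array((r1 + r2 + r3) / 3.0,
                          mask=stack.mask.any(axis=0))

# The debugging step: cells where Cell Statistics has a value but the
# Raster Calculator output went NoData.
only_in_cellstats = calc.mask & ~np.ma.getmaskarray(cellstats)
for row, col in zip(*np.nonzero(only_in_cellstats)):
    vals = stack[:, row, col].tolist()   # masked entries print as None
    print(f"cell ({row}, {col}): inputs {vals} "
          f"-> CellStats mean {cellstats[row, col]}")
```

If that is what is happening in your data, the two outputs will disagree exactly at the cells with missing inputs.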
-Steve