I have several raster mosaics in both IMG and TIF format. When I load them, they display fine as Stretched, and the high/low values look correct. But when I try to classify the values in the layer properties, the classification statistics are totally different from the actual raster. This is a new problem that started last week, with no new installs or patches. Even old rasters that used to work fine now have the same problem. What gives? I've done this literally thousands of times on thousands of images and never had an issue until last week.
Same issue on multiple computers.
Narrowing it down: ArcMap seems to be taking the high and low values from the statistics, subtracting them, and then using the difference as the high value in the classified stats, as if it were trying to stretch or resample on its own. My raster high is 336 and low is 171 (single-band thermal image, 16-bit), and the classified stats max out at 165 (336 − 171). I want to classify based on the actual pixel values, not the stretched values.
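A minimal NumPy sketch of what the numbers above suggest is happening: the classification statistics appear to be computed on offset values (pixel − minimum) rather than on raw pixel values, which would make the classified maximum come out as max − min. The sample array is hypothetical, invented only to match the stated high/low of 336/171:

```python
import numpy as np

# Hypothetical single-band 16-bit thermal values matching the
# stated range: actual low = 171, actual high = 336.
pixels = np.array([171, 200, 250, 300, 336], dtype=np.uint16)

raw_min = int(pixels.min())         # 171
raw_max = int(pixels.max())         # 336 -- what classification *should* use

# Suspected behavior: stats computed on (value - min) instead of raw values.
offset = pixels - raw_min
classified_max = int(offset.max())  # 336 - 171 = 165, matching the bad stats

print(raw_min, raw_max, classified_max)
```

If the classified maximum in ArcMap consistently equals max − min, that points to the renderer reading stretched/offset statistics instead of the raster's actual statistics for these formats.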
Feels like a checkbox or random setting somewhere spontaneously turned itself on? Latest update: if I import to a GRID, I get the result I want, but TIFFs and IMGs are a no-go.