Hi
I'm trying to do some algebra with two height rasters.... Just a simple Raster1 minus Raster2 to get a height difference.
The trouble is, I think, that the two rasters are different resolutions, and when I inspect my resulting TIFF, none of the values agree with what I expect:
eg. if I inspect a random pixel
Raster1 = 265
Raster2 = 228
OutputRaster = 97 (of course I expect the value to be 37)
The resolution of the output is the same as the input raster with the lower resolution (but the pixels do not line up). Logically the output should have the same resolution as the more detailed input.
Is there something I can do so this works intelligently with rasters of differing resolutions?
The resolution of the output is the same as the input raster with the lower resolution (but the pixels do not line up). Logically the output should have the same resolution as the more detailed input.
To get the behavior you want, set the cell size environment to MINOF. (The default is MAXOF, so you don't "invent" data detail that does not exist.)
ArcGIS 10.2 help: Cell Size (Environment setting)
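A minimal arcpy sketch of that approach, assuming a Spatial Analyst license; the raster names and output path are placeholders, not your actual data:

import arcpy
from arcpy.sa import Raster

arcpy.CheckOutExtension("Spatial")

# MINOF makes the output use the finer (smaller) cell size of the inputs;
# the default MAXOF snaps the result to the coarser input instead.
arcpy.env.cellSize = "MINOF"

# Placeholder paths -- substitute your own height rasters.
diff = Raster("Raster1.tif") - Raster("Raster2.tif")
diff.save(r"C:\data\height_difference.tif")

With MINOF the coarser raster is resampled on the fly to the finer cell size before the subtraction, so each output pixel is Raster1 minus Raster2 at that location.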
Resample the raster that has the larger cell size to match the cell size of the other raster, then run the calculation again.
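For example, with arcpy (the paths and the "1 1" target cell size below are placeholders for illustration; bilinear resampling suits a continuous surface such as heights):

import arcpy
from arcpy.sa import Raster

arcpy.CheckOutExtension("Spatial")

# Resample the coarser raster to the finer raster's cell size.
arcpy.management.Resample("Raster2.tif", "Raster2_resampled.tif",
                          "1 1", "BILINEAR")

# Now both inputs share the same cell size, so the subtraction
# compares values at matching pixels.
diff = Raster("Raster1.tif") - Raster("Raster2_resampled.tif")
diff.save(r"C:\data\height_difference.tif")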