Raster Algebra - a question of resolution?

08-14-2014 02:50 AM
BenLeslie1
Occasional Contributor III

Hi

I'm trying to do some algebra with two height rasters.... Just a simple Raster1 minus Raster2 to get a height difference.

The trouble is, I think, that the two rasters are different resolutions, and when I inspect the resulting TIFF the values are not what I expect:

eg. if I inspect a random pixel

     Raster1 = 265

     Raster2 = 228

     OutputRaster = 97 (of course I expect the value to be 37)

The resolution of the output is the same as the input raster with the lower resolution (but the pixels do not line up).  Logically the output should have the same resolution as the more detailed input.

Is there something I can do so this works intelligently with rasters of differing resolutions?

2 Replies
M__TanerAktas
New Contributor II

Resample the raster with the larger cell size to match the cell size of the other raster.

Then do the calculation again.
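For example, a rough arcpy sketch of that workflow (the raster names and the 5 x 5 cell size below are placeholders, not from the original post; substitute your own data):

import arcpy
from arcpy.sa import Raster

arcpy.CheckOutExtension("Spatial")   # Map Algebra needs Spatial Analyst

# Resample the coarser DEM to the finer DEM's cell size.
# BILINEAR is a reasonable choice for continuous surfaces such as elevation.
arcpy.Resample_management("dem_coarse.tif", "dem_coarse_resampled.tif",
                          "5 5", "BILINEAR")

# With matching cell sizes the subtraction is a straight cell-for-cell difference.
diff = Raster("dem_fine.tif") - Raster("dem_coarse_resampled.tif")
diff.save("height_difference.tif")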

curtvprice
MVP Esteemed Contributor
The resolution of the output is the same as the input raster with the lower resolution (but the pixels do not line up).  Logically the output should have the same resolution as the more detailed input.

To get the behavior you want, set the cell size environment to MINOF. (The default is MAXOF, so you don't "invent" data detail that does not exist.)

ArcGIS 10.2 help: Cell Size (Environment setting)
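If you are running the calculation from Python rather than the Raster Calculator dialog, the same idea looks roughly like this (a sketch only; the raster names are placeholders, and snapRaster is an extra suggestion to keep the output cells aligned with the finer input):

import arcpy
from arcpy.sa import Raster

arcpy.CheckOutExtension("Spatial")   # Map Algebra needs Spatial Analyst

# MINOF = use the smallest cell size of the inputs for the output.
# The default, MAXOF, uses the largest, which is what you are seeing now.
arcpy.env.cellSize = "MINOF"

# Snapping to the finer raster keeps the output cells registered to it.
arcpy.env.snapRaster = "dem_fine.tif"

diff = Raster("dem_fine.tif") - Raster("dem_coarse.tif")
diff.save("height_difference.tif")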