# Intermediate raster leads to incorrect final output in arcpy map algebra

Question asked by JakeNederend on Aug 16, 2017
Latest reply on Aug 18, 2017 by JakeNederend

I want to calculate an index and can't figure out why two seemingly identical scripts produce very different results. The input raster is a multiband GeoTIFF (32-bit float), and I need to use the RGB bands as inputs. The RGB bands first need to be normalized to a 0-1 scale (e.g. Red/Red_max), then the chromatic coordinates need to be computed (e.g. red/(red+green+blue)), and finally the index itself (ExG = 2*green - red - blue, using the chromatic coordinates).
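For reference, the per-pixel arithmetic I'm describing can be sketched in plain Python (the sample band values here are made up, already on the 0-1 scale):

```python
# Hypothetical per-pixel RGB values, already normalized to 0-1.
red, green, blue = 0.5, 0.6, 0.4

# Chromatic coordinates: each band divided by the band sum.
total = red + green + blue
r, g, b = red / total, green / total, blue / total

# Excess Green index, computed from the chromatic coordinates.
ExG = 2 * g - r - b
print(ExG)
```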

Since my data are already on the 0-1 scale, I can skip the normalization step, but I still run into the problem. Can someone please explain why this:

```python
red = Float(Raster(inraster+"/Band_1")/(Raster(inraster+"/Band_1")+Raster(inraster+"/Band_2")+Raster(inraster+"/Band_6")))
green = Float(Raster(inraster+"/Band_2")/(Raster(inraster+"/Band_1")+Raster(inraster+"/Band_2")+Raster(inraster+"/Band_6")))
blue = Float(Raster(inraster+"/Band_6")/(Raster(inraster+"/Band_1")+Raster(inraster+"/Band_2")+Raster(inraster+"/Band_6")))
ExG = 2*green - red - blue
```

produces the correct output, but the following does not:

```python
red = Raster(inraster+"/Band_1")
green = Raster(inraster+"/Band_2")
blue = Raster(inraster+"/Band_6")
red_cc = Float(red/red+green+blue)
green_cc = Float(green/red+green+blue)
blue_cc = Float(blue/red+green+blue)
ExG = 2*green_cc - red_cc - blue_cc
```

I did the math for a single pixel by hand, and the former is definitely correct. For example:

The original input values, prior to computing the chromatic coordinates, are: Red = 0.427023, Green = 0.405449, Blue = 0.349515.

The ExG works out to 0.029070.

With the first method I get 0.029070, but with the second I get 0.080466.
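My hand calculation can be reproduced in plain Python, evaluating the two chromatic-coordinate expressions literally as each script writes them (pixel values as given above); the two results match the two rasters' values:

```python
# Pixel values from the example above (already on the 0-1 scale).
red, green, blue = 0.427023, 0.405449, 0.349515

# Expression as written in the first script: the band sum is
# explicitly parenthesized before dividing.
total = red + green + blue
exg_1 = 2 * (green / total) - (red / total) - (blue / total)

# Expression literally as written in the second script,
# i.e. green/red+green+blue with no parentheses.
exg_2 = (2 * (green / red + green + blue)
         - (red / red + green + blue)
         - (blue / red + green + blue))

print(exg_1, exg_2)
```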