I am trying to calculate the green chlorophyll index (GCI), one of several vegetation indices I'm computing. I am working with imagery from a MicaSense RedEdge 3 camera (proper radiometric calibration procedures have already been completed). I thought you are supposed to get values between -1 and 1, but instead my values range from 0.09 to 14.4, as you can see in the image I have attached.
The equation for the GCI, as defined by the literature, is: (NIR/Green) - 1
My dataset bands are defined as follows for this example: Green = "band_2" and NIR = "band_5"
The equation I used for Raster Calculator is this: Float("band_5") / Float("band_2") - 1
Any help would be greatly appreciated! Thanks guys!
(P.S. This is a picture of a grain sorghum research plot)
That doesn't look like a normalized index.
If band 5 has a value of 100 and band 2 has a value of 50 (i.e. raw pixel values), then GCI = 100/50 - 1 = 1, and larger band ratios will push you well outside the -1 to 1 range. Are the band values supposed to be 'raw' or normalized?
Imagery and Remote Sensing might be a better place to move this, since it really isn't an ArcGIS Pro issue.
Agreed with Dan
https://www.indexdatabase.de/db/i-single.php?id=128 confirms you have the correct equation for GCI, but it's clearly not a normalized index. If NIR = 250 and Green = 1, GCI will be 249.
For 8-bit data, -1 ≤ GCI ≤ 254, and you'll also have to check for Green = 0 to avoid division by zero.
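If you step outside Raster Calculator, the zero-division check can be sketched in NumPy like this. The band arrays here are small hypothetical values standing in for the real rasters (band_5 = NIR, band_2 = Green), just to show the masking idea:

```python
import numpy as np

# Hypothetical pixel values standing in for the NIR (band_5) and Green (band_2) rasters.
nir = np.array([[250.0, 100.0],
                [0.0,   80.0]])
green = np.array([[1.0, 50.0],
                  [10.0, 0.0]])

# Start with NaN everywhere, then compute NIR / Green only where Green != 0.
gci = np.full(nir.shape, np.nan)
np.divide(nir, green, out=gci, where=green != 0)
gci -= 1  # GCI = NIR / Green - 1; NaN stays NaN where Green was 0
```

Pixels where Green is 0 end up as NaN (NoData) rather than raising a division error, which mirrors what you would want in a raster workflow.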
I did not get a chance to normalize this data. Since it is not a normalized dataset, I am thinking of just using the data as is.
If you absolutely have to normalize it, I would try extracting the data (e.g. using the Zonal Statistics as Table tool) and then normalizing it using the method described here: https://stats.stackexchange.com/questions/70801/how-to-normalize-data-to-0-1-range
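For what it's worth, the min-max rescaling from that link is just x' = (x - min) / (max - min). A minimal sketch, using made-up GCI values like the range you reported:

```python
import numpy as np

# Hypothetical GCI values spanning roughly the range in the question (0.09 to 14.4).
gci = np.array([0.09, 2.5, 7.0, 14.4])

# Min-max rescale to [0, 1]; use np.nanmin/np.nanmax instead if NoData is stored as NaN.
gci_norm = (gci - gci.min()) / (gci.max() - gci.min())
```

Note this only rescales relative to your own scene's minimum and maximum, so values aren't comparable across flights or dates.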
Not claiming to be an expert, but this might help!