Hello! I am struggling with understanding the three types of RMS error reported while georeferencing in ArcGIS Pro. All of the documentation seems to include this paragraph (emphasis added):
The forward residual shows you the error in the same units as the data frame spatial reference. The inverse residual shows you the error in the pixel units. The forward-inverse residual is a measure of how close your accuracy is, measured in pixels. All residuals closer to zero are considered more accurate.
OK, so I understand how forward and inverse RMSE are calculated, but what the heck is forward-inverse?! The description is a non-explanation; it sounds exactly the same as inverse RMSE. Obviously it isn't, though, because it is given as a fractional decimal less than 1. Can anyone explain or provide the mathematical formula for forward-inverse RMSE so I can understand what it actually is and why it might be useful to interpret along with the other two?
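For what it's worth, one plausible reading (this is my interpretation, not anything confirmed by Esri) is that the forward-inverse residual is a round-trip error: push each control point's pixel coordinates through the forward transform into map space, then back through the separately fitted inverse transform, and measure how far the point lands from where it started, in pixels. Since the inverse is its own least-squares fit rather than the exact algebraic inverse, the round trip is not the identity, and the result is typically a small sub-pixel number. A minimal sketch with a first-order (affine) transform; all names and control points here are my own, not ArcGIS internals:

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares affine transform mapping src (N,2) onto dst (N,2)."""
    A = np.hstack([src, np.ones((len(src), 1))])  # augment with [x, y, 1]
    coef, *_ = np.linalg.lstsq(A, dst, rcond=None)
    return coef  # shape (3, 2): dst ≈ [src, 1] @ coef

def apply_affine(coef, pts):
    A = np.hstack([pts, np.ones((len(pts), 1))])
    return A @ coef

def rmse(residuals):
    """Root mean square of per-point residual lengths."""
    d = np.linalg.norm(residuals, axis=1)
    return np.sqrt(np.mean(d ** 2))

# Hypothetical control points: pixel coords plus noisy map coords
rng = np.random.default_rng(0)
pix = rng.uniform(0, 1000, size=(8, 2))
scale = np.array([[2.0, 0.1], [-0.1, 2.0]])
map_xy = pix @ scale + np.array([500000.0, 4000000.0]) \
         + rng.normal(0, 0.5, (8, 2))

fwd = fit_affine(pix, map_xy)   # pixels -> map units
inv = fit_affine(map_xy, pix)   # map units -> pixels (a separate fit)

forward_res = apply_affine(fwd, pix) - map_xy                     # map units
inverse_res = apply_affine(inv, map_xy) - pix                     # pixels
roundtrip_res = apply_affine(inv, apply_affine(fwd, pix)) - pix   # pixels

print("forward RMSE (map units):", rmse(forward_res))
print("inverse RMSE (pixels):   ", rmse(inverse_res))
print("fwd-inv RMSE (pixels):   ", rmse(roundtrip_res))
```

On this toy data the round-trip RMSE comes out well under a pixel, which would at least be consistent with the fractional values ArcGIS Pro reports, but again, this is a guess at the formula, not documentation of it.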
The only thing that seems obvious is that neither the programmer nor the author of the help understands what was actually implemented, i.e. what formula was written. Users have already requested a more detailed explanation on this point, but ESRI has not provided one.
I have encountered the same issue. I haven't found a sufficient explanation of how these metrics are calculated. In particular, the difference between the green and red residual lines is vague to me: how are they determined, and what do they indicate?
I copied the following blurb from Stack Exchange regarding a question about a "reverse root mean square value":
The two formulas on that page seem to show the familiar root-mean-square formula, sqrt((1/N) * sum_{i=1..N} x_i^2), followed by what looks like an "inverse" of the same formula, ((1/N) * sum_{i=1..N} sqrt(|x_i|))^2. This might be the difference; I just have not verified it by calculating against the residuals from my latest georeferencing run.
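To check that these really are two different quantities, here is a quick comparison on a made-up set of residual magnitudes (the numbers are mine, not from the linked page). By the power-mean inequality, the mean of square roots, squared, can never exceed the root of the mean of squares, so the second formula always comes out smaller or equal:

```python
import math

# Hypothetical per-point residual magnitudes from a georeferencing run
residuals = [0.8, 1.2, 0.5, 2.1, 0.9]
N = len(residuals)

# Familiar RMS: square root of the mean of the squares
rms = math.sqrt(sum(x ** 2 for x in residuals) / N)

# The "inverse"-looking formula: square of the mean of the square roots
inv_like = (sum(math.sqrt(abs(x)) for x in residuals) / N) ** 2

print(f"RMS       = {rms:.4f}")
print(f"'inverse' = {inv_like:.4f}")
```

Running this gives RMS ≈ 1.23 versus ≈ 1.04 for the second formula, so on identical residuals the "inverse" style number is systematically smaller, which might explain why it tends to show up as a small decimal.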