
"Standard Error" GA Layer to Points != "Prediction Standard Error Map"

Question asked by dwetta on Jul 8, 2014
Latest reply on Mar 23, 2018 by EKrause-esristaff

I thought I had my head around validation in Geostatistical Analyst (checking a model against a held-out test subset), but I find the following apparent discrepancy confusing and troubling.

 

=====

 

I have split my input data into training and test data sets using "Subset Features".
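For concreteness, the call I'm making is roughly the following in arcpy (the paths and the 70/30 split are placeholders for my actual values):

```python
import arcpy

# Subset Features is a Geostatistical Analyst tool
arcpy.CheckOutExtension("GeoStats")

# Randomly split the input points into training and test subsets
# (70% training / 30% test here; placeholder paths)
arcpy.SubsetFeatures_ga(
    in_point_features="C:/data/work.gdb/input_points",
    out_training_feature_class="C:/data/work.gdb/train_points",
    out_test_feature_class="C:/data/work.gdb/test_points",
    size_of_training_dataset=70,
    subset_size_units="PERCENTAGE_OF_INPUT")
```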

 

I have built a simple kriging model using the training data set.

 

I run GA Layer to Points on the training-data kriging layer, selecting the test data set as the Input point observation locations; this creates a "Standard Error" field (among others). My understanding is that this field holds the square root of the kriging variance at each test point.
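Scripted, that step is essentially this ("Kriging Layer" stands in for whatever my wizard-built layer is named in the table of contents):

```python
import arcpy

arcpy.CheckOutExtension("GeoStats")

# Predict at the held-out test locations using the training-data kriging
# layer. The output gains a predicted-value field plus the "Standard Error"
# field I'm reading as the square root of the kriging variance.
arcpy.GALayerToPoints_ga(
    in_geostat_layer="Kriging Layer",  # wizard-built layer (name assumed)
    in_locations="C:/data/work.gdb/test_points",
    out_feature_class="C:/data/work.gdb/test_points_pred")
```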

 

I right-click my training-data kriging layer and select "Change Output to Prediction Standard Error". I thought this also displayed the square root of the kriging variance.
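Just so we're using the same terms, the quantity I believe both outputs represent at an unmeasured location s_0 is

```latex
\sigma(s_0) = \sqrt{\operatorname{Var}\!\left[\hat{Z}(s_0) - Z(s_0)\right]}
```

where \hat{Z}(s_0) is the kriged prediction and Z(s_0) is the (unknown) true value, i.e. the square root of the kriging variance.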

 

- The Prediction Standard Error map values at my test point locations are significantly lower than the "Standard Error" values from the GA Layer to Points step (roughly 1/5 of the value).

- I've confirmed this by converting the Prediction Standard Error map to a raster, extracting its values to the same test data set I ran GA Layer to Points on, and comparing the two values at every point (sketched below).
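The comparison, sketched in arcpy (layer name and paths are placeholders; this assumes the layer has already been switched to Prediction Standard Error, so the export picks up that surface rather than the predictions):

```python
import arcpy

arcpy.CheckOutExtension("GeoStats")
arcpy.CheckOutExtension("Spatial")  # Extract Values to Points needs Spatial Analyst

# Export the geostatistical layer (currently rendering Prediction
# Standard Error) to a raster
arcpy.GALayerToGrid_ga(
    in_geostat_layer="Kriging Layer",
    out_surface_grid="C:/data/work.gdb/se_raster")

# Sample that raster at the test points; the new RASTERVALU field can then
# be compared against the "Standard Error" values from GA Layer to Points
arcpy.sa.ExtractValuesToPoints(
    "C:/data/work.gdb/test_points",
    "C:/data/work.gdb/se_raster",
    "C:/data/work.gdb/test_points_se")
```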

 

=====

 

Are the "Standard Error" values from "GA Layer to Points" not the same as the "Prediction Standard Error" from the "Prediction Standard Error Maps"?

Am I using one of the tools wrong?

 

Any clarification is greatly appreciated; let me know if a better description or more elaboration from my end would help. I'm using ArcGIS 10.1 SP1.

 

Dave Wetta
