Calculating MISE for a kernel density estimation

Discussion created by mr_bennyc on Jan 21, 2012

I'm working with a set of spatial data and am looking to run kernel density estimation on it. However, I'm struggling a little to decide on the best bandwidth. From reading around, I've seen that the best bandwidth is often the one with the lowest Mean Integrated Squared Error (MISE). Does anyone know how to calculate this for an already completed kernel density estimate?

If it helps, I calculated my kernels using the tool GME (Geospatial Modelling Environment).
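In case it clarifies what I'm after: my understanding is that MISE can't be computed directly from the data, since it depends on the true (unknown) density. What can be done in practice is least-squares cross-validation (LSCV), which minimises an unbiased estimate of the integrated squared error, so the LSCV-selected bandwidth approximately minimises MISE. Here's a rough sketch of the idea in Python with a Gaussian kernel on toy 1-D data (the data and grid of candidate bandwidths are just placeholders, not anything specific to GME):

```python
# Least-squares cross-validation (LSCV) for a 1-D Gaussian KDE.
# LSCV(h) estimates ISE(h) minus a constant, so minimising it over h
# approximately minimises the MISE of the density estimate.
import numpy as np

def lscv(x, h):
    """LSCV score for bandwidth h:
    integral of fhat^2  -  (2/n) * sum of leave-one-out estimates fhat_{-i}(x_i)."""
    n = len(x)
    d = x[:, None] - x[None, :]          # pairwise differences x_i - x_j
    # For a Gaussian kernel, integral(fhat^2) has a closed form:
    # (1 / (n^2 h)) * sum_ij (K*K)((x_i - x_j) / h), where K*K is the N(0, 2) pdf
    term1 = np.exp(-(d / h) ** 2 / 4).sum() / (n ** 2 * h * np.sqrt(4 * np.pi))
    # Leave-one-out density estimates at the data points
    k = np.exp(-(d / h) ** 2 / 2) / np.sqrt(2 * np.pi)
    np.fill_diagonal(k, 0.0)             # drop the i == j self-contribution
    loo = k.sum(axis=1) / ((n - 1) * h)
    return term1 - 2.0 * loo.mean()

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 200)            # toy data; replace with real coordinates
hs = np.linspace(0.05, 1.5, 60)          # candidate bandwidths to search over
scores = [lscv(x, h) for h in hs]
h_best = hs[int(np.argmin(scores))]
print("LSCV-selected bandwidth:", h_best)
```

So rather than evaluating MISE on the finished kernel surface, the bandwidth would be chosen before (or by re-running) the estimation. Is that roughly what people do, or is there a way to score an existing output?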

Many thanks