I have used Gaussian Geostatistical Simulation in ArcGIS, but I would like to understand better how the method works. When the number of simulations is increased, the global mean of the output map stays almost constant when I choose the "mean" output type, but it keeps increasing when I choose the "maximum" output type.

Hi Carine,

What you are seeing is expected. The Gaussian Geostatistical Simulations tool works by creating many different rasters, all simulated from the same geostatistical model. When you output the mean raster, each cell is the average of all the simulated values for that cell. This average should not change drastically as you increase the number of simulations, because the sample mean stabilizes fairly quickly around the model's mean. When you output a maximum raster, each cell is the maximum of all the simulated values for that cell. Unlike the mean, the maximum can only stay the same or increase with each additional simulation, so it keeps growing as you run more simulations.
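You can see the same behavior with a quick experiment outside ArcGIS. This is just a sketch, not the tool itself: it treats a single raster cell as repeated draws from one Gaussian distribution (the mean of 10 and standard deviation of 2 are made-up illustrative numbers), then compares the running mean and running maximum as the number of "simulations" grows.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical single raster cell: each draw stands in for one
# simulation from the same geostatistical model.
draws = rng.normal(loc=10.0, scale=2.0, size=10_000)

for n in (10, 100, 1_000, 10_000):
    subset = draws[:n]
    # The mean settles near 10; the max can only ratchet upward.
    print(f"n={n:>6}  mean={subset.mean():.3f}  max={subset.max():.3f}")
```

Running this, the mean column hovers near the model mean for every n, while the max column is non-decreasing and creeps higher as n grows, which is exactly the pattern you observed in the two output rasters.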

Please let me know if you need more clarification, but what you are seeing is what is supposed to happen.