Hi guys.
I have the following problem (probably due to my lack of experience with mosaic datasets):
I have a set of 366 rasters (one for each day in 2008) with the following specs:
- 1000 columns, 950 rows
- 1 band
- Cell size is 5 km
- TIF format
- 32 bit floating point
- Coordinate system is ETRS_1989_LAEA
The question is: why am I getting different results when calculating the mean with the Spatial Analyst Cell Statistics tool versus a mosaic dataset loaded with the same set of rasters? The difference can be almost 2%.
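For reference, this is roughly how I compute the Cell Statistics mean (paths are placeholders, not my actual ones):

```python
# Per-cell mean of the 366 daily TIFs with Spatial Analyst Cell Statistics
import arcpy
from arcpy.sa import CellStatistics

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\data\tifs_2008"       # placeholder folder with the 366 TIFs

rasters = arcpy.ListRasters("*", "TIF")          # all 366 daily rasters
mean_2008 = CellStatistics(rasters, "MEAN", "DATA")   # mean per cell, ignoring NoData
mean_2008.save(r"C:\data\mean_2008_cellstats.tif")
```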
Cell Statistics seems to be the correct one, because I get the same values in other ways: the rasters come from a large netCDF, and I verified the mean of the same slices with numpy (as well as with other tools, such as Sample and the multidimension tools).
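Here is, more or less, the numpy check I ran directly against the netCDF (the path, variable name and 2008 start index below are placeholders):

```python
# Independent check of the 2008 mean, read straight from the source netCDF
import numpy as np
from netCDF4 import Dataset

nc = Dataset(r"C:\data\source.nc")             # placeholder path to the big netCDF
var = nc.variables["my_variable"]              # placeholder name; shape (time, rows, cols)

start_2008 = 0                                 # placeholder index of 1 Jan 2008 on the time axis
slab = var[start_2008:start_2008 + 366, :, :]  # the 366 daily slices of 2008 (leap year)

mean_grid = np.ma.mean(slab, axis=0)           # per-cell mean over time, fill values masked out
print(np.ma.mean(mean_grid))                   # overall average, for a quick comparison
```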
I'm using ArcGIS 10.5.1 on a Windows 8.1 workstation with 16 GB of RAM, and I've been very careful to set up the mosaic dataset with the correct properties (the pixel depth of the mosaic is 32 bit and the maximum request size is set appropriately). I also created a mosaic dataset based on the original netCDF (9131 temporal steps) and got the same results as with the TIF-based mosaic when applying a definition query for the year 2008 and setting the mosaic operator to Mean. Values in the rasters are roughly in the 0 - 10000 range.
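And this is roughly how I built the TIF-based mosaic (written from memory, so the parameter names should be checked against the tool help; paths and request sizes are placeholders):

```python
# Create the mosaic dataset, load the TIFs, set Mean operator and request size
import os
import arcpy

gdb = r"C:\data\mosaics.gdb"                   # placeholder geodatabase
md  = "daily_2008"                             # placeholder mosaic dataset name
sr  = arcpy.SpatialReference(3035)             # ETRS_1989_LAEA

arcpy.CreateMosaicDataset_management(gdb, md, sr, 1, "32_BIT_FLOAT")
arcpy.AddRastersToMosaicDataset_management(os.path.join(gdb, md),
                                           "Raster Dataset",
                                           r"C:\data\tifs_2008")
# Mean operator plus a larger request size, so the whole stack is averaged on the fly
arcpy.SetMosaicDatasetProperties_management(os.path.join(gdb, md),
                                            rows_maximum_imagesize=25000,
                                            columns_maximum_imagesize=25000,
                                            mosaic_operator="MEAN")
```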
Thanks a lot to anyone who can give me a hint.
Alberto