Mosaic dataset operator versus Cell Statistics - what is wrong?

12-06-2017 09:08 AM
AlbertoAloe
Regular Contributor

Hi guys.

I have the following problem (probably due to my lack of familiarity with mosaic datasets):

I have a set of 366 rasters (one for each day in 2008) with the following specs:

  • 1000 columns, 950 rows
  • 1 band
  • Cell size is 5 km
  • TIF format
  • 32 bit floating point
  • Coordinate system is ETRS_1989_LAEA

The question is: why am I getting different results when calculating the mean with Spatial Analyst Cell Statistics versus a mosaic dataset loaded with the same set of rasters? The difference can be almost 2%.

Cell Statistics seems to be the correct one, because I get the same values in other ways. The rasters come from a big NetCDF, and I verified the mean of the same slices with NumPy (as well as with other tools, such as Sample and the multidimension tools).
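
Roughly, the NumPy check looks like this (only a sketch: the file name, variable name, and time offsets are placeholders, not the real ones):

    import numpy as np
    from netCDF4 import Dataset

    nc = Dataset("daily_values.nc")         # hypothetical file name
    var = nc.variables["value"]             # hypothetical variable name
    # 2008 is a leap year: 366 daily slices along the time axis.
    year_2008 = var[0:366, :, :]            # adjust the offset to where 2008 starts
    mean_2008 = year_2008.mean(axis=0)      # per-cell mean; masked cells are ignored
    print(mean_2008.min(), mean_2008.max())
    nc.close()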

I'm using ArcGIS 10.5.1 on a Windows 8.1 workstation with 16 GB of RAM, and I've been very careful to set up the mosaic dataset with the correct properties (the pixel depth of the mosaic is 32 bit and the maximum size of requests is properly set). I also created a mosaic dataset based on the original NetCDF (9131 temporal steps) and got the same results as the TIF-based mosaic when applying a definition query for the year 2008 and setting the operator to Mean. Values in the rasters are roughly in the range 0 - 10000.
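
For completeness, the setup is essentially this (paths and names are placeholders; the property keywords are as I read them in the 10.5.1 tool help, so double-check them on other versions):

    import arcpy

    gdb = arcpy.CreateFileGDB_management(r"C:\temp", "mosaic_test.gdb").getOutput(0)
    sr = arcpy.SpatialReference(3035)  # ETRS_1989_LAEA
    md = arcpy.CreateMosaicDataset_management(
        gdb, "daily2008", sr, num_bands=1, pixel_type="32_BIT_FLOAT").getOutput(0)
    arcpy.AddRastersToMosaicDataset_management(
        md, "Raster Dataset", r"C:\temp\rasters2008")   # folder with the 366 TIFs
    arcpy.SetMosaicDatasetProperties_management(
        md,
        rows_maximum_imagesize=1200,    # request size larger than 1000 x 950
        columns_maximum_imagesize=1200,
        max_num_per_mosaic=400,         # so all 366 rasters contribute to the mean
        default_mosaic_method="NONE",
        mosaic_operator="MEAN")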

Thanks a lot to whoever can give me a hint.

Alberto

23 Replies
AlbertoAloe
Regular Contributor

Dan,

no news. I was thinking of involving ESRI support and, for this purpose, I created a script that replicates exactly what I'm doing. Given an input folder containing the 366 rasters, it creates a geodatabase with a mosaic dataset that points to the rasters and sets all the relevant properties. It also outputs three rasters (the core steps are sketched after the list):

  1. ./data/outputMeanRasters/MeanFromCellStats.tif containing the mean we know is correct
  2. ./data/outputMeanRasters/MeanFromMosaic.tif containing the mean from the mosaic
  3. ./data/outputMeanRasters/PercentVar.tif containing the percent variation
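
In outline, the three outputs are produced like this (paths shortened and hypothetical; the actual script does the same with full ones):

    import arcpy
    from arcpy.sa import CellStatistics, Raster

    arcpy.CheckOutExtension("Spatial")
    arcpy.env.workspace = r".\data\inputRasters"      # the 366 daily TIFs
    md = r".\data\test.gdb\daily2008"                 # mosaic built by the script

    # 1. The reference mean from Spatial Analyst.
    mean_cs = CellStatistics(arcpy.ListRasters("*.tif"), "MEAN", "DATA")
    mean_cs.save(r".\data\outputMeanRasters\MeanFromCellStats.tif")

    # 2. The mosaic itself, read with operator = Mean, exported to a TIF.
    arcpy.CopyRaster_management(md, r".\data\outputMeanRasters\MeanFromMosaic.tif")
    mean_md = Raster(r".\data\outputMeanRasters\MeanFromMosaic.tif")

    # 3. The percent variation between the two.
    pct_var = (mean_md - mean_cs) / mean_cs * 100.0
    pct_var.save(r".\data\outputMeanRasters\PercentVar.tif")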

The strange thing is that the script above works with the 'None' mosaic method, meaning that the rasters are sorted by ObjectID in the mosaic (although for the sake of calculating the mean the order should be irrelevant). If you switch to 'By Attribute' using the SupportNr field (which the script populates so that the sort order differs from ObjectID ascending), you get a mean raster from the mosaic that is slightly different from the one obtained before. (??)
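
Concretely, the only difference between the two runs is this property change (keyword spellings as I read them in the tool help; worth double-checking):

    import arcpy

    md = r".\data\test.gdb\daily2008"   # hypothetical path

    # Run 1: mosaic method None -> rasters ordered by ObjectID.
    arcpy.SetMosaicDatasetProperties_management(
        md, default_mosaic_method="NONE", mosaic_operator="MEAN")

    # Run 2: mosaic method By Attribute on SupportNr -> different order,
    # yet a slightly different mean raster.
    arcpy.SetMosaicDatasetProperties_management(
        md,
        allowed_mosaic_methods="NONE;BY_ATTRIBUTE",
        default_mosaic_method="BY_ATTRIBUTE",
        order_field="SupportNr",
        order_base="0",
        mosaic_operator="MEAN")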

Zip file attached. Unzipping it and double-clicking the .py replicates everything. (It works with 10.5.1; for older versions the signature of arcpy.SetMosaicDatasetProperties_management may be slightly different.)

Alberto

AlbertoAloe
Regular Contributor

Dan,

after contacting ESRI support, they recognized it as a bug in both ArcMap and ArcGIS Pro:

BUG-000113559 : Mosaic operator ‘Mean’ produces the incorrect results in ArcMap
BUG-000113661 : Mosaic operator ‘Mean’ produces the incorrect results in ArcGIS Pro

It should be fixed in upcoming releases of the software.

Ciao

Alberto

DanPatterson_Retired
MVP Emeritus

Thanks for the update... I will check this in 2.2 beta

So for the interim... numpy wins???

XingboChen
Emerging Contributor

Hi, Alberto,

I have a somewhat similar problem with the 'Max' mosaic operator. When I set the mosaic operator to 'Max', some cell values in the mosaic dataset differ from those in the original rasters, which are DEMs in ESRI GRID format. But if I set the mosaic operator to 'First', I get matching values. Do you have any idea whether this is the same bug you mentioned? I can't find BUG-000113559 on the ESRI website.
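
A quick way I compare them (paths and the probe point below are made up):

    import arcpy
    from arcpy.sa import CellStatistics

    arcpy.CheckOutExtension("Spatial")
    md = r"C:\data\dem.gdb\dem_mosaic"            # hypothetical mosaic dataset
    grids = [r"C:\data\dem1", r"C:\data\dem2"]    # the ESRI GRID DEMs
    point = "4500000 2900000"                     # an X Y inside the overlap

    # Cell value from the mosaic under the Max operator...
    arcpy.SetMosaicDatasetProperties_management(md, mosaic_operator="MAX")
    print(arcpy.GetCellValue_management(md, point).getOutput(0))

    # ...versus the true per-cell maximum of the source DEMs.
    true_max = CellStatistics(grids, "MAXIMUM", "DATA")
    print(arcpy.GetCellValue_management(true_max, point).getOutput(0))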
