Hi
I am using the spatial analyst zonal statistics tool to get the standard deviation of elevation. For zones, I have a coarser raster with each cell uniquely numbered.
The tool runs without error, but the standard deviation (STD) result is zero in every zone.
Running the same analysis for mean or maximum gives the correct values.
And, running the same analysis after converting the zonal grid to polygons gives correct values.
Is this a known bug, or have I missed something important?
cheers
Dan
Looks to me like your standard deviations are being calculated from a single pixel (the center of each large cell). This will always give you zero - no variation from one value! It worked for you with polygons because by default the cell size used was your value grid cell size.
The default cell size used is usually the max of the inputs (this varies by tool). If you set the geoprocessing environment cellsize to the value grid before the tool is run, you may get the results you're looking for.
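To illustrate the mechanism (not the arcpy API), here is a minimal pure-Python sketch using a hypothetical 2x2 zone grid laid over a 4x4 elevation grid: when each zone is effectively sampled at a single cell, the standard deviation is always zero, while aggregating every fine cell inside the zone gives the real spread.

```python
from statistics import pstdev

# Hypothetical 4x4 elevation grid (the fine "value" raster).
elev = [
    [10.0, 12.0, 30.0, 31.0],
    [11.0, 13.0, 29.0, 32.0],
    [50.0, 55.0, 70.0, 75.0],
    [52.0, 58.0, 72.0, 78.0],
]

def zone_cells(zr, zc):
    """All fine-grid values falling inside coarse zone (zr, zc),
    where each zone covers a 2x2 block of elevation cells."""
    return [elev[r][c]
            for r in range(2 * zr, 2 * zr + 2)
            for c in range(2 * zc, 2 * zc + 2)]

# With the analysis cell size left at the coarse zone grid, each zone
# contributes a single sample -> population std dev is always 0.
single_sample_std = pstdev([elev[0][0]])

# With the cell size set to the value grid, every cell participates.
full_std = {(zr, zc): pstdev(zone_cells(zr, zc))
            for zr in range(2) for zc in range(2)}
```

This is why setting the environment cell size to the value grid changes the answer: the tool then sees all the elevation cells in each zone instead of one.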
Of course, this means your STD value is replicated across all cells in each zone. This seems very strange until you realize that it allows zonal statistics to be used inside a larger map algebra expression - for example, to identify all the cells in each zone that hold the zone's maximum value:
ZonalStatistics(zonegrid, "VALUE", valuegrid, "MAXIMUM") == valuegrid
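The logic of that expression can be mimicked outside ArcGIS. Here is a sketch with toy flattened grids (hypothetical data and names, not arcpy): compute the per-zone maximum, replicate it back onto every cell the way ZonalStatistics does for its output raster, then apply the cell-wise `== valuegrid` comparison to get a mask of the cells holding each zone's maximum.

```python
# Toy flattened grids: zone id and value for each cell (hypothetical data).
zonegrid  = [1, 1, 1, 2, 2, 2]
valuegrid = [3, 7, 7, 4, 9, 2]

# Step 1: the zonal statistic MAXIMUM -> one value per zone.
zone_max = {}
for z, v in zip(zonegrid, valuegrid):
    zone_max[z] = max(v, zone_max.get(z, v))

# Step 2: replicate the zone statistic onto every cell of the zone,
# which is exactly what the ZonalStatistics output raster contains.
zonal_raster = [zone_max[z] for z in zonegrid]

# Step 3: the map algebra "== valuegrid" as a cell-wise boolean mask.
is_zone_max = [zs == v for zs, v in zip(zonal_raster, valuegrid)]
# -> [False, True, True, False, True, False]
```

Replicating the statistic to every cell is what makes step 3 a simple cell-by-cell comparison, which is the point of the design choice described above.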
I would be remiss not to mention that statistics calculated by the zonal tools are by definition area (cell area) weighted - for more accurate weighting, it is best practice to project the raster data before doing these calculations.
By the way, this part of the documentation is important:
You can specify the output cell size or resolution, or you can take the default. The default cell size, or resolution, for analysis results is set to the largest cell size of all the input raster datasets for the tool.
ArcGIS Help (10.2, 10.2.1, and 10.2.2): Environment settings for raster data