Hello helpful people!
I have a raster with population data at a resolution of 30.8 x 30.8 meters. Each grid cell holds the number of people living in that cell; some cells have no population at all, while other areas are more densely populated. Values range from 0 to about 50.
To make analysis of the numbers easier I want to resample the grid to a resolution of 100 m (i.e. 1 ha per grid cell). With the Resample tool I do get 100 m pixels, but the way the values of these new pixels are calculated does not meet my needs. With the NEAREST method each new pixel simply takes over the value of the underlying pixel, but when I then multiply that value by the area factor of the resolution change and run Zonal Statistics, the totals don't match the original population (the differences range between -18% and +13%). Visually I can also see big differences.
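To illustrate what I mean, here is a minimal NumPy sketch of the problem (synthetic random data, not my actual raster): nearest-neighbour resampling only ever samples about one in ten source cells, so scaling the result by the area ratio cannot recover the original total.

```python
import numpy as np

rng = np.random.default_rng(0)
src_res, dst_res = 30.8, 100.0
n = 65                                   # ~2 km x 2 km synthetic source raster
pop = rng.integers(0, 51, size=(n, n)).astype(float)

# Nearest resampling: each 100 m cell copies the 30.8 m cell under its centre
m = int(n * src_res // dst_res)          # 20 target cells per axis
centres = (np.arange(m) + 0.5) * dst_res
idx = (centres // src_res).astype(int)
nearest = pop[np.ix_(idx, idx)]

# Scaling by the area ratio does NOT recover the original total: only
# ~1 in 10.5 source cells is ever sampled, the rest are simply dropped
scaled = nearest * (dst_res / src_res) ** 2
diff_pct = 100 * (scaled.sum() - pop.sum()) / pop.sum()
print(f"total error: {diff_pct:+.1f}%")
```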
The mathematically cleaner alternative would be to resample down to 1 m first, rescale the values, and then run Aggregate with the SUM method. But resampling down to 1 m resolution takes forever (tried in ArcGIS 10.6 and ArcGIS Pro 2.2).
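The upsample-then-aggregate idea itself does preserve the totals exactly. Here is a small NumPy sketch of the principle, using a 30 m source grid so that the cell sizes divide evenly (with 30.8 m the same logic applies, just with sub-cell rounding at the block edges):

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical example: 30 m source cells, 1 m intermediate grid, 100 m target
src_res, fine_res, dst_res = 30, 1, 100
n = 10                                  # 10 x 10 source cells = 300 m x 300 m
pop = rng.integers(0, 51, size=(n, n)).astype(float)

# Step 1: "resample" to 1 m by splitting each cell into 30 x 30 subcells,
# dividing the value so each subcell carries its share of the population
f = src_res // fine_res
fine = np.repeat(np.repeat(pop, f, axis=0), f, axis=1) / f**2

# Step 2: aggregate the 1 m cells into 100 m blocks with SUM
b = dst_res // fine_res
m = fine.shape[0] // b                  # 3 target cells per axis
agg = fine.reshape(m, b, m, b).sum(axis=(1, 3))

# The population total is preserved exactly
assert np.isclose(agg.sum(), pop.sum())
```

So the approach is sound; the bottleneck is purely that the 1 m intermediate raster is enormous.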
I also found the Resample function (Resample function—Help | ArcGIS Desktop), where I would expect the Average method to give better results, but I haven't been able to figure out whether it can be applied to a raster like mine, or how I would use it.
Any other suggestions for how to get a better result?