I have a web service of elevation data. The service has global extent and contains many different elevation datasets of different resolutions.
From time to time I need to add datasets to, or remove datasets from, the derived mosaic used to publish the service.
Due to the large quantity of data involved, Calculate Statistics takes around two weeks to complete, and every time the service is modified it needs to be re-run.
Because of this long run time there is a heightened risk of Calculate Statistics failing due to external factors. This has happened on several occasions, each time requiring a restart from scratch and further extending the overall time taken.
The current "fix" suggested by support is to implement 64-bit processing to decrease the time taken to run Calculate Statistics.
What would be really useful is the ability to calculate statistics incrementally on each component dataset and then aggregate those statistics into the statistics for the derived mosaic, rather than having to calculate statistics on the derived mosaic as a whole. There would also need to be a way to subtract a dataset's statistics in a similar manner when it is removed from the mosaic.