I wrote a model to process a large number of rasters. It asks for an input folder and iterates through each raster, performing the steps. Some of the raster outputs are very large, and I will occasionally get an error that stops the whole model.
Attached is a screenshot of my model. It only fails at:
(1) the "raster to polygon" step, when the output shapefile exceeds the 2 GB limit, or
(2) the "dissolve" step, when it runs out of memory. (Note: none of the data is treated as 'intermediate' because of the memory issues; the dissolved polygon output can be extremely large, with over 3 million features.)
What I am looking for is A WAY TO KEEP THE MODEL RUNNING. If a step fails, is there a way I can keep the iteration going and move on to the next raster? It takes ~3 hours to process each raster, and I need to be able to run this overnight and on weekends. If it fails on the first raster, I'm losing too much of my processing time, so I want it to at least keep running.
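To show the kind of behavior I mean, here is a minimal sketch of the skip-on-failure pattern, assuming the model were exported to a standalone Python script. `process_raster` is a hypothetical stand-in for the real geoprocessing calls (raster to polygon, dissolve, etc.); the point is the try/except around each iteration, which logs the failure and continues with the next raster instead of aborting the whole run:

```python
import os
import traceback

def process_raster(raster_path):
    """Hypothetical placeholder for the per-raster workflow.
    In the real script this would run the exported model's tool
    calls; any failure should raise an exception."""
    # ... real geoprocessing calls would go here ...
    if "bad" in raster_path:  # simulate a raster that fails
        raise RuntimeError("Raster to Polygon exceeded 2 GB limit")

def process_folder(raster_paths, log_path="failed_rasters.log"):
    """Process every raster; if one fails, record it and move on."""
    failed = []
    for path in raster_paths:
        try:
            process_raster(path)
        except Exception:
            # Log the failed raster and the full traceback, then
            # continue the loop with the next raster.
            failed.append(path)
            with open(log_path, "a") as log:
                log.write(path + "\n" + traceback.format_exc() + "\n")
    return failed
```

With this pattern, a weekend run finishes every raster it can, and the log tells me which ones to revisit, e.g. `process_folder(["a.tif", "bad.tif", "c.tif"])` would skip only `bad.tif` and keep going.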
PS: If you have ideas on eliminating the failures altogether, that would be helpful too! I have almost 300 of these to process.