I wrote a model to process a large number of rasters. It asks for an input folder and iterates through each raster to perform the steps. Some of the raster outputs are very large, and I will occasionally get an error that stops the whole model.
Attached is a screenshot of my model. It only fails at:
(1) the "Raster to Polygon" step, when the shapefile exceeds the 2 GB limit, or
(2) the "Dissolve" step, from lack of memory. (Note: none of the data is 'intermediate' because of the memory issues; the dissolved polygon can simply be extremely large, with over 3 million features.)
What I am looking for is A WAY TO KEEP THE MODEL RUNNING. If a step fails, is there a way to keep the iteration going and move on to the next raster? It takes ~3 hours to process each raster, and I need to be able to run this overnight and on weekends. If it fails on the first raster, I'm losing too much of my processing time, so I want it to at least keep running.
Thanks
PS: If you have ideas on eliminating the failures altogether, that would be helpful too! I have almost 300 of these to do.
I'm not sure how to do this in ModelBuilder, but you could export your model to Python and add some error handling there. One thing to try would be to test for file size and number of features and skip those that are too big (using if-then statements). Another option, if you don't want to test for size first, is a try/except statement. You'd have a for loop to pass through all your data, and inside that a try and an except to catch the errors, with a continue under the except to let the model know it can skip to the next item in the for loop.
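Here's a minimal sketch of that (the folder paths, output names, and dissolve field are placeholders; swap in whatever your exported model actually uses):

import arcpy
import os

in_folder = r"C:\data\rasters"    # placeholder input folder
out_ws = r"C:\data\outputs"       # placeholder output workspace

arcpy.env.workspace = in_folder
for raster in arcpy.ListRasters():
    try:
        name = os.path.splitext(raster)[0]
        poly = os.path.join(out_ws, name + "_poly.shp")
        dissolved = os.path.join(out_ws, name + "_dis.shp")
        arcpy.RasterToPolygon_conversion(raster, poly, "SIMPLIFY", "VALUE")
        # "GRIDCODE" is the value field Raster to Polygon writes for shapefile output
        arcpy.Dissolve_management(poly, dissolved, "GRIDCODE")
    except arcpy.ExecuteError:
        # A tool failed (2 GB shapefile limit, out of memory, etc.):
        # log the message and skip to the next raster instead of stopping
        print("Failed on {0}: {1}".format(raster, arcpy.GetMessages(2)))
        continue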
Hi
Wrap your Raster to Polygon and Dissolve steps in a Calculate Value Python code block and use a try/except block. Alternatively, you can integrate a script tool built from those GP tools. You may want to add a Boolean output as a precondition to the Dissolve so it doesn't run on empty polygon output.
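A rough sketch of the Calculate Value version (assuming in-model variables named %Raster% and %Output Polygons%; rename to match yours): set the Expression to processOne(r"%Raster%", r"%Output Polygons%"), set the Data Type to Boolean, and put this in the Code Block:

def processOne(raster, out_poly):
    import arcpy
    try:
        arcpy.RasterToPolygon_conversion(raster, out_poly)
        return True    # success: lets the Dissolve precondition pass
    except arcpy.ExecuteError:
        return False   # failure (e.g. 2 GB limit): Dissolve is skipped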
The desired format is a model so that it can be easily used by others or modified later.
I like the idea of integrating a script into the model. I'm just not as familiar with doing it that way (I either write the whole script myself or make a model; I'm still learning how to do the two together).
Do either of you have example scripts of your suggestions?
Is there a way to wait until there's an error and tell the model to just move on to the next raster? Or do I have to predict what all the errors might be and write something to catch them all? Basically my task is twofold: 1) process 300 rasters, and 2) have a robust model that can be rerun later without any errors.
Hi
Integrating a script into a model:
Integrating scripts within a model—Help | ArcGIS for Desktop
The easiest way to make the script is to copy a Python snippet from a Result made manually, then edit it.
Using Calculate Value is a bit more basic; the thing to watch is to protect path inputs by using raw text for the variable value:
r"%MyPathVariable%"
That will stop Python interpreting some escape characters.
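For example (the variable name is just an illustration):

# If %MyPathVariable% substitutes to C:\temp\new_rasters, a plain string
# would read \t as a tab and \n as a newline. The r prefix keeps the
# backslashes literal:
path = r"%MyPathVariable%"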
Hi,
Iterators are really convenient, but they have this limitation, as you have encountered. You can of course get around it by scripting, or you can remove the iterator from your model and batch run it instead. As far as I can remember, failed runs will not hold everything else back.
Any particular reason why you want to write the %val%_poly to a shapefile instead of a feature class in a file geodatabase? Your 2 GB limit will disappear.
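In the exported script that change is just a matter of pointing the outputs at feature class paths inside a file geodatabase; a quick sketch with placeholder paths:

import arcpy
import os

out_gdb = r"C:\data\outputs.gdb"    # placeholder file geodatabase
if not arcpy.Exists(out_gdb):
    arcpy.CreateFileGDB_management(r"C:\data", "outputs.gdb")

# Feature classes in a file gdb are not bound by the shapefile 2 GB limit
out_poly = os.path.join(out_gdb, "raster1_poly")
arcpy.RasterToPolygon_conversion(r"C:\data\rasters\raster1.tif", out_poly)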
Best wishes for the New Year!
Cheers,
Mike
Thanks! There was no particular reason I went with a shapefile versus a feature class. I just learned about the size allowances today from a coworker, and I think that will help fix a lot of the problems.
How do you set up a batch run in GIS? I've only done it before with a shell script. My "final" model needs to be something anyone can run through the tool interface in GIS.
After you've created the model (minus the iterator), right-click on it and select Batch....
Thanks! My final product needs to be user-friendly, and batch won't work for that. But I will definitely use it when I'm processing on my own from now on.