I am iterating a model 1000 times to simulate the effect of the number and spacing of sample points on tree canopy cover estimates. Each iteration generates random point locations, buffers those points, and clips the buffered area. Sample points are selected based on the clipped buffer area, and percent canopy cover is calculated both from the sample points and from the area of the clipped buffer relative to the entire area. These values are written to a personal geodatabase table via Make Table View and Calculate Field.
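For context, here is roughly what each iteration does, rewritten as an arcpy sketch rather than the actual ModelBuilder tools. All paths, layer names, the point count, and the buffer distance are placeholders, and the percent-cover arithmetic is a simplified stand-in for my real calculation:

```python
import arcpy

# Placeholder paths -- not my actual data
GDB = r"C:\data\canopy.mdb"            # personal geodatabase
STUDY_AREA = GDB + r"\study_area"      # hypothetical polygon layer
CANOPY_PTS = GDB + r"\canopy_points"   # hypothetical point layer
OUT_TABLE = GDB + r"\results"          # table receiving the estimates

arcpy.env.overwriteOutput = True

def one_iteration(n_points=50, buffer_dist="100 Meters"):
    # 1. Generate random point locations within the study area
    rand = arcpy.CreateRandomPoints_management(
        "in_memory", "rand_pts", STUDY_AREA, "", n_points)

    # 2. Buffer the random points
    buf = arcpy.Buffer_analysis(rand, "in_memory/buf", buffer_dist)

    # 3. Clip the buffered area to the study area
    clip = arcpy.Clip_analysis(buf, STUDY_AREA, "in_memory/clip")

    # 4. Select the sample points that fall inside the clipped buffers
    lyr = arcpy.MakeFeatureLayer_management(CANOPY_PTS, "pts_lyr")
    arcpy.SelectLayerByLocation_management(lyr, "INTERSECT", clip)

    # 5. Percent canopy cover from the selected points (the real model
    #    also scales by clipped-buffer area relative to the whole area)
    n_hit = int(arcpy.GetCount_management(lyr).getOutput(0))
    pct_cover = 100.0 * n_hit / n_points

    # 6. Write the value into the geodatabase table
    view = arcpy.MakeTableView_management(OUT_TABLE, "results_view")
    arcpy.CalculateField_management(view, "PCT_COVER", pct_cover)
```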
As the model iterates, memory usage climbs steadily (now around 225 MB after 3 hours) and the iterations seem to run slower and slower. This run has already taken more than 3 hours, which seems excessive given how simple the operations performed in each iteration are.
My assumption is that intermediate data from each iteration is accumulating over time and slowing things down, but there does not appear to be any setting that lets me delete whatever is building up at the end of each iteration.
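If it helps show what I mean, in a standalone script I could at least do something like the following after each pass (reusing the hypothetical one_iteration() from the sketch above); I cannot find an equivalent setting in ModelBuilder:

```python
import arcpy

for i in range(1000):
    one_iteration()

    # Explicitly drop everything written to the scratch workspace so
    # intermediates do not accumulate across iterations
    arcpy.Delete_management("in_memory")
```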
Any suggestions to speed this up would be welcome. Thank you for your time and attention.
I would attach an image of the model, but I am in the midst of a run right now. I will attach one later if seeing the model would help diagnose what I may be doing wrong.