Model iteration slows over time and memory usage increases

01-24-2013 02:00 PM
IvanKautter
New Contributor III
I am iterating a model 1,000 times to simulate the effect of sample point number and spacing on tree canopy cover estimates.  The model generates random point locations, buffers those points, and clips the buffered area.  Sample points are then selected based on the clipped buffer area, and a percent canopy cover is calculated both from the sample points and from the area of the clipped buffer relative to the entire area.  These values are written to a personal geodatabase table via Make Table View and Calculate Field.

As the model iterates, memory usage keeps climbing (around 225 MB after 3 hours) while each iteration seems to run slower and slower.  This particular run is taking more than 3 hours, which seems ridiculous considering the simplicity of the operations performed in each iteration.

My assumption is that data from each iteration is accumulating over time and thereby slowing things down.  There does not appear to be any setting that allows whatever is accumulating to be deleted at each iteration.

Any suggestions for speeding up what I am doing are welcome.  Thank you for your time and attention.

I would attach an image of the model, but I am in the midst of a model run now.  Will attach later if seeing the model would assist in an assessment of what I may be doing wrong.
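
In the meantime, here is a very rough Python/arcpy sketch of what one iteration does. Every path, name, field, and distance below is a placeholder, not the actual model:

import arcpy

arcpy.env.overwriteOutput = True

study_area = r"C:\data\canopy.gdb\study_area"        # clipping feature (placeholder)
canopy_points = r"C:\data\canopy.gdb\canopy_points"  # points sampled for cover (placeholder)
results = r"C:\data\results.mdb\estimates"           # personal gdb output table (placeholder)
scratch = r"C:\data\scratch.gdb"                     # workspace for intermediates (placeholder)

def one_iteration(run_id, n_points=25, buffer_dist="30 Meters"):
    # 1. Generate random point locations inside the study area
    rand_pts = arcpy.CreateRandomPoints_management(scratch, "rand_pts",
                                                   study_area, "", n_points)
    # 2. Buffer the random points
    buffers = arcpy.Buffer_analysis(rand_pts, scratch + "\\buffers", buffer_dist)
    # 3. Clip the buffered area to the study area
    clipped = arcpy.Clip_analysis(buffers, study_area, scratch + "\\clipped")

    # 4. Select the sample points that fall inside the clipped buffers
    lyr = arcpy.MakeFeatureLayer_management(canopy_points, "pts_lyr")
    arcpy.SelectLayerByLocation_management(lyr, "INTERSECT", clipped)
    n_selected = int(arcpy.GetCount_management(lyr).getOutput(0))

    # 5. Percent canopy cover (actual formula omitted here), written out via
    #    Make Table View + Calculate Field, as in the model
    view = arcpy.MakeTableView_management(results, "results_view")
    arcpy.CalculateField_management(view, "N_SELECTED", str(n_selected), "PYTHON")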
8 Replies
IvanKautter
New Contributor III
Model graphic attached.  It took 5 hours for the model run to complete.  That's the longest it has ever taken.

I should also add that the number of points being buffered is around 25 and the clipping feature is not large at all, so it's not as if the GIS operations being conducted are at all intensive.

I'll also add that I am running this from ArcCatalog 10.0 SP5.  When the model completes, ArcCatalog appears to hold onto all the memory it allocated over the model run.  I have to force ArcCatalog to quit; otherwise it hangs, not responding, for a while.
IvanKautter
New Contributor III
This last model run took 15 hours.  The time it takes to run the model seems to be getting worse and worse.
SandraD
New Contributor
Hi Ivan,

If I were encountering your problem:

1st step: I would check my computer's processing capabilities against what the software requires, if I hadn't already.

2nd step: I would double-check that all data are not being stored in the in_memory workspace. With selected sets of features, all of the processing is often done in the model's in_memory workspace.

I hope that helps.

-Sande
IvanKautter
New Contributor III
Core i5 @ 2.80 GHz, 4 GB RAM, Windows 7 Pro 64-bit.

The feature classes and data tables are specifically not in_memory.  I am currently trying to place them in_memory, and that doesn't seem to be speeding anything up.

Thanks for the feedback though.
IvanKautter
New Contributor III
The same problem is reported here (in an ArcObjects context, but again involving iterative looping):

http://forums.arcgis.com/threads/28156-Geoprocessing-tool-leaking-memory
curtvprice
MVP Esteemed Contributor
As the model iterates, memory usage keeps climbing (around 225 MB after 3 hours) while each iteration seems to run slower and slower. This particular run is taking more than 3 hours, which seems ridiculous considering the simplicity of the operations performed in each iteration.


Memory leaks sure sound like your issue, but there are other things you can do that may help with your throughput:

Check whether you are accumulating a lot of temporary files or filling a workspace; finding "free" scratch names (arcpy.CreateScratchName) can take time. If you force your ModelBuilder loop to clean up after itself at the end of each iteration (say, with a Delete tool that has the output of the process set as a precondition), you may get better performance.
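
In script form that cleanup pattern looks roughly like this (the prefix, workspace path, and loop count are placeholders):

import arcpy

scratch_ws = r"C:\data\scratch.gdb"  # placeholder scratch workspace

for i in range(1000):
    # Ask for a guaranteed-unused output name; this lookup gets slower as the
    # workspace fills with leftover datasets
    tmp_fc = arcpy.CreateScratchName("buf", "", "FeatureClass", scratch_ws)

    # ... buffer / clip / selection steps writing to tmp_fc ...

    # Clean up at the end of the iteration so the workspace stays small
    arcpy.Delete_management(tmp_fc)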

Do not use personal geodatabases (.mdb); use file geodatabases instead and you will have much better results. Personal geodatabases are deprecated, limited in size, and generally slower.

Running Compact_management on your geodatabase at the end of the run (or with a precondition from Calculate Value set to '%n% mod 25', after every 25 runs) may dramatically improve your performance by defragmenting the gdb.
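
In a script the same idea looks something like this (the geodatabase path is a placeholder; in ModelBuilder it is the Calculate Value precondition described above):

import arcpy

gdb = r"C:\data\results.gdb"  # placeholder geodatabase path

for n in range(1, 1001):
    # ... one model iteration ...

    # Defragment the geodatabase every 25 runs
    if n % 25 == 0:
        arcpy.Compact_management(gdb)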

Consider putting intermediate datasets in the in_memory workspace - essentially a RAM disk, for you old codgers out there. Just make sure you delete them at the end of each iteration, or that they are flagged as intermediate in the model. If the datasets are fairly small, this will work and can increase performance many times over.
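
A rough sketch of that pattern, with made-up dataset names and buffer distance:

import arcpy

for i in range(1000):
    # Write intermediates to the in_memory workspace instead of to disk
    buffers = arcpy.Buffer_analysis("rand_pts", "in_memory/buffers", "30 Meters")
    clipped = arcpy.Clip_analysis(buffers, "study_area", "in_memory/clipped")

    # ... selection and field calculations ...

    # Drop the in-memory datasets so they do not accumulate across iterations
    # (deleting "in_memory" itself clears the whole workspace)
    arcpy.Delete_management("in_memory/buffers")
    arcpy.Delete_management("in_memory/clipped")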

This is the kind of work where the 10.1 64-bit geoprocessing really helps, as you can access more than 4GB of physical memory.
IvanKautter
New Contributor III
This may be totally anecdotal, but I noticed that the Desktop Indexing Service was often running while my model was running.  The service was set to index the user folder to which I was outputting the data (in a subfolder of that directory).  So it may have been the case that, since I was constantly altering the contents of that directory and the service was set to index very few folders, the indexing service was slowing the model's execution and imposing some sort of memory limit.  At present the model is clipping along at a decent pace and is using well over 256 MB of memory without any noticeable slowdown in execution.
KimOllivier
Occasional Contributor III
I recommend you use the model as a documentation diagram and recode it completely in Python without using any geoprocessing tools. Each tool takes time to start and writes out a lot of temporary results. They are not designed to be iterated.

Have you considered using the new geometry objects? You can use them to clip, intersect, and perform the operations you seem to be doing on simpler objects in memory, without calling a tool.
http://resources.arcgis.com/en/help/main/10.1/index.html#//018z00000070000000
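
For example, one iteration might look roughly like this (untested; the path, point count, and buffer distance are made up, and overlaps between buffers are ignored):

import arcpy
import random

# Load the clip polygon once as a geometry object (path is a placeholder)
study_area = arcpy.CopyFeatures_management(r"C:\data\canopy.gdb\study_area",
                                           arcpy.Geometry())[0]
sr = study_area.spatialReference
ext = study_area.extent

# Generate 25 random points inside the study area, as geometry objects
pts = []
while len(pts) < 25:
    p = arcpy.PointGeometry(arcpy.Point(random.uniform(ext.XMin, ext.XMax),
                                        random.uniform(ext.YMin, ext.YMax)), sr)
    if study_area.contains(p):
        pts.append(p)

# Buffer and clip each point with geometry methods -- no tool calls in the loop
clipped_area = 0.0
for pt in pts:
    buf = pt.buffer(30.0)                 # buffer distance is illustrative
    piece = buf.intersect(study_area, 4)  # dimension 4 = return polygon geometry
    clipped_area += piece.area

pct_of_study_area = 100.0 * clipped_area / study_area.area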