ArcPy in Script Memory Flush

09-28-2011 10:06 AM
Status: Open
It would be great to be able to flush RAM in the middle of a script, to prevent my process from being killed prematurely for lack of memory. I am dealing with a number of cursors and files with over 1 million rows each. Any ability to clear RAM during the operation would be greatly appreciated.

That is a big issue with arcpy. You can manage Python's memory use manually with garbage collection, but the geoprocessor has a separate memory space from Python's, so you can't actually do anything to that object.
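The split described above can be demonstrated with plain Python (arcpy itself is only available inside an ArcGIS install, so the cursor rows are simulated here): dropping references and forcing a collection frees Python-managed memory, but it cannot touch allocations made on the geoprocessor's C++ side.

```python
import gc

# Build a large Python-side structure (stands in for rows read from a cursor).
rows = [tuple(range(10)) for _ in range(100_000)]

# Dropping the reference lets Python reclaim the list itself...
del rows
unreachable = gc.collect()  # force a full collection pass

# ...but this only frees memory that Python manages. Memory held by the
# geoprocessor (the arcpy/arcgisscripting native side) is untouched, which
# is why del + gc.collect() alone often doesn't lower the process footprint.
print(f"gc reclaimed {unreachable} unreachable objects")
```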
Effective memory management for ArcPy would be desirable. I am executing many tools, each one taking its input from its predecessor. Memory usage keeps increasing; it seems the resources occupied by each tool are not freed until the whole script has finished.

It would be great if there were some method on arcpy that purged memory, even if it came with a performance hit. If my data crunching takes several hours, I'm not going to worry about a few extra seconds. I was imagining something like:


Behind the scenes, this would release cursors, objects that are out of scope, and so on.


This is definitely an area of concern for my operation. We are running about 30 automated geoprocessing scripts at night.

There need to be more options for tackling heavy geoprocessing than the Dice tool, and a memory flush would help greatly.
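Until a flush exists, one reliable workaround for chains of nightly scripts is to launch each heavy step as its own OS process: when the child process exits, the operating system reclaims all of its memory, including whatever the geoprocessor allocated internally. The script names below are placeholders, and the child here just simulates a step so the sketch is runnable without arcpy.

```python
import subprocess
import sys

# Placeholder names for two scripts in a nightly geoprocessing chain.
steps = ["step1.py", "step2.py"]

for script in steps:
    # In a real chain each child would import arcpy and run one tool,
    # e.g. [sys.executable, script]; exiting returns all memory to the OS.
    result = subprocess.run(
        [sys.executable, "-c", f"print('finished {script}')"],
        capture_output=True, text=True, check=True,
    )
    print(result.stdout.strip())
```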
I agree with the previous comments.
My concern is also the crunching of a large database (over 4 GB and millions of entries).
Once the database I want to process reaches a certain size, arcpy gives me an error.
When I check my RAM load, everything looks fine. Even when I use something like "arcpy.Delete_management" there is no reduction in memory load.
Please optimize this. Currently arcpy is unusable for the way I work.
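Note that Delete_management removes a dataset from disk; it does not shrink the running process. For multi-million-row sources, one pattern that does cap peak memory is processing in fixed-size batches so only one chunk is resident at a time (this also works with `arcpy.da.SearchCursor`, which is itself a lazy iterator). The chunking helper below is a generic sketch, demonstrated on a simulated table:

```python
def read_in_chunks(rows, chunk_size=50_000):
    """Yield fixed-size batches from an iterable so only one batch is
    held in memory at once (works with any row iterator, including an
    arcpy.da.SearchCursor inside an ArcGIS install)."""
    chunk = []
    for row in rows:
        chunk.append(row)
        if len(chunk) >= chunk_size:
            yield chunk
            chunk = []
    if chunk:
        yield chunk

# Simulated table of 120,000 rows; peak memory is one 50,000-row chunk.
total = 0
for batch in read_in_chunks(range(120_000)):
    total += len(batch)
print(total)  # 120000
```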

Maybe that is why using SpatiaLite on large datasets works so much better.