I have a script that iterates through a set of points and performs a variety of raster and vector analyses. Watching Mem Usage in the Windows Task Manager, I can see more and more memory being consumed until ArcMap crashes at around 1 GB.
Are there any effective ways for identifying the source of the runaway memory leak?
Update: the memory appears to spike when a Geometry object is created, but deleting the object, whether with Python's "del" statement or with arcpy.Delete_management, does not cause it to abate. Does anybody know how to delete a geometry object?
I think it's a geoprocessing memory leak issue. I have had all sorts of problems when I do lots of looping in Python using the geoprocessor. There did not seem to be a solution.
The only improvement I could make was to keep the data in shapefiles rather than in a personal geodatabase. This was especially true when doing lots of selections. Ultimately the code would still crash as it ran out of memory, so I had to run it in batches, periodically closing down ArcMap. Ridiculous really; you would have thought they had fixed all that, as it's been a continuing problem since 9.3.
I am definitely not sure, so I am totally guessing. I suspect that Delete_management isn't made to interact with Python/arcpy objects directly, same as many arcpy functions. It's possible that Esri is slowly adding support for acting on them, but if that were supported already, I don't see why an in_memory feature class/set would be needed. I tried this and got okay results:
import arcpy

# Raw string so "\U" in the path isn't treated as an escape sequence
line_shapefile = r"C:\Users\jhook\Desktop\lines.shp"
output = "in_memory/output"

for i in range(100):
    geom_obj = arcpy.CopyFeatures_management(line_shapefile, output)
I am guessing that most arcpy functions act on file-like objects, so by creating in_memory file-like objects, people can still use those functions. I don't know anything about future plans; this is all guessing. I would stick with in_memory feature classes for now.
On another quick note, the del statement doesn't actually delete objects; it unbinds the name from the object, which *may* result in deletion. The garbage collector decides that stuff. There are probably lingering references to the object somewhere deep in arcpy or something. Again, hard tellin' not knowin'.
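To illustrate the point about del in plain Python (no arcpy involved): in CPython an object is reclaimed only when its last reference disappears, and del only removes one name. This sketch uses a hypothetical Payload class with a __del__ hook just to show when reclamation actually happens.

import gc

deleted = []

class Payload:
    """Stand-in for a large object such as an arcpy Geometry."""
    def __del__(self):
        deleted.append(True)   # record when the object is truly reclaimed

obj = Payload()
alias = obj        # a second reference to the same object

del obj            # unbinds the name 'obj'; the object survives via 'alias'
assert not deleted

del alias          # last reference gone: CPython reclaims it immediately
assert deleted

# Objects caught in reference cycles need an explicit sweep:
gc.collect()

So if arcpy itself holds a reference to your geometry internally, del on your variable cannot free it, no matter how many times you call it.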
Thanks for all the information. It's hard to believe that ESRI has let this problem go on for so long, isn't iterative geoprocessing one of the primary reasons for scripting?
Does anybody know if there's a way to manipulate the garbage collector to purge all but a few memory items between iterations? The only Python-related workaround I've come across is to spawn separate processes, but this seems somewhat cumbersome and complicated (I'm trying to keep my code relatively simple for ease of customization).
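For what it's worth, the separate-process workaround doesn't have to be very complicated. The idea is that when a child process exits, the operating system reclaims all of its memory, leaks included. A minimal sketch, where the inline worker_code string is a hypothetical stand-in for a real worker script that would import arcpy and process one batch of points:

import subprocess
import sys

# Hypothetical stand-in for a per-batch worker; in the real script this
# would be a separate .py file that imports arcpy and processes one
# batch of points, so each batch's memory dies with its process.
worker_code = "print('processed batch', {batch})"

outputs = []
for batch in range(3):
    # A fresh interpreter per batch: when it exits, the OS reclaims
    # everything it allocated, leaked or not.
    result = subprocess.run(
        [sys.executable, "-c", worker_code.format(batch=batch)],
        capture_output=True, text=True, check=True,
    )
    outputs.append(result.stdout.strip())

The parent loop stays simple; only the worker needs to know about the geoprocessing details, and a crashed batch doesn't take the whole run down with it.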