I recently wrote a geoprocessing script that makes heavy use of arcpy.MakeNetCDFTableView_md to analyze locations against thousands of NetCDF files containing climate model data. Despite using Delete_management on the resulting table views as soon as I was done with them, memory never seemed to be freed. By the time it had processed about 150 of the NetCDF files, ArcMap was using nearly 2 GB of memory and my computer had nothing else to offer. It was also painfully slow, with each file taking more than 10 seconds to process.
After trying and failing to find why memory was leaking, I decided to run the script in the IDLE Python shell rather than in ArcMap. Both of my problems were suddenly solved. Memory was freed properly after tables were deleted, and the script's memory use was stable. Further, processing time dropped by more than an order of magnitude, with each file now taking less than a second to process (this despite Python using only 25% of my CPU, whereas ArcMap consumed everything while I was running the script there).
I haven't run into this issue with any of the other scripts I've written.
So I didn't post this as a question, because I sort of solved it by not running the script within ArcMap. However, I thought I'd bring it up anyway to provoke some discussion.
Sounds about right... ArcMap adds its own overhead, and it decides when stuff gets released even if Python is done with it. Overwriting an existing table view would be an option, and note that the only environment parameter the tool honors is the current workspace. Try it in Pro: Python 3.5 generally offers some speedup over parts of 2.7 as well, so everything adds up.
I'll bring this up to see if there is a way it can be made more efficient at the tool level.
Delete_management may not be helping, because it deletes data from disk (first sentence in the Delete tool's Summary). Meanwhile, MakeNetCDFTableView_md holds the table views in memory (first point in the Usage notes for MakeNetCDFTableView).
As Dan points out, the third point in Usage states 'An existing table view will be overwritten if the same table view name is entered.' If you can use the same name repeatedly, the tool will effectively handle the delete for you, and the script should have a steady memory footprint.
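A minimal sketch of that pattern (assumptions: `netcdf_files`, `variable`, and `row_dimension` are placeholders for your actual inputs, `process_netcdf_files` is a hypothetical helper name, and the import is deferred only so the sketch reads outside an ArcGIS session):

```python
def process_netcdf_files(netcdf_files, variable, row_dimension):
    """Hypothetical helper: analyze many NetCDF files through a single
    reused table view name, so each MakeNetCDFTableView_md call overwrites
    the previous view instead of accumulating views in memory."""
    import arcpy  # deferred so the sketch can be read without ArcGIS installed

    view_name = "nc_view"  # fixed name: each call overwrites the last view
    for nc_file in netcdf_files:
        arcpy.MakeNetCDFTableView_md(nc_file, variable, view_name, row_dimension)
        # ... run the per-file analysis against view_name here ...
```

The key point is simply that `view_name` never changes, so no explicit Delete_management call is needed inside the loop.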
It may also be a good idea to use a generator expression instead of building full lists. Generators are very helpful when reaching memory limits, as they process a single item at a time; under the hood this often allows resources to be released earlier than they otherwise would be.
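For example, a pure-Python sketch of the difference (here `analyze` and the file names just stand in for the real per-file NetCDF work):

```python
def analyze(nc_file):
    """Stand-in for the real per-file analysis (hypothetical)."""
    return len(nc_file)

netcdf_files = ["a.nc", "bb.nc", "ccc.nc"]  # placeholder paths

# List comprehension: every result is built and held in memory at once.
as_list = [analyze(f) for f in netcdf_files]

# Generator expression: results are produced one at a time, on demand,
# so each one can be released as soon as the consumer moves past it.
as_gen = (analyze(f) for f in netcdf_files)

total = sum(as_gen)  # consumes the generator lazily, one item at a time
```

With thousands of files, only the generator's single in-flight result is live at any moment, which is exactly the behavior you want when memory is tight.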