I recently wrote a geoprocessing script that makes heavy use of arcpy.MakeNetCDFTableView_md to analyze locations against thousands of NetCDF files containing climate model data. Despite calling Delete_management on each resulting TableView as soon as I was done with it, memory never seemed to be freed. By the time it had processed about 150 of the NetCDF files, ArcMap was using nearly 2 GB of memory and my computer had nothing else to offer. It was also painfully slow, with each file taking more than 10 seconds to process.
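For context, the loop was structured roughly like this. This is a simplified sketch, not my actual script: the view naming and the analyze step are placeholders, and the arcpy calls are passed in as parameters purely so the sketch is self-contained (in the real script they were direct calls to arcpy.MakeNetCDFTableView_md and arcpy.Delete_management).

```python
def process_netcdf_files(nc_paths, make_view, delete, analyze):
    """Create a short-lived NetCDF table view per file, analyze it, delete it.

    make_view and delete stand in for arcpy.MakeNetCDFTableView_md and
    arcpy.Delete_management; they are parameters here only so this sketch
    runs without arcpy installed.
    """
    results = []
    for i, path in enumerate(nc_paths):
        view = "nc_view_{}".format(i)  # unique table view name per iteration
        make_view(path, view)          # in arcpy: MakeNetCDFTableView_md(path, variable, view)
        try:
            results.append(analyze(view))
        finally:
            delete(view)               # in arcpy: Delete_management(view) -- should release the view
    return results
```

The point of the pattern is that each view lives only for one iteration, so in principle memory use should stay flat no matter how many files are processed.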
After trying and failing to find why memory was leaking, I decided to run the script in the IDLE Python shell rather than in ArcMap. Both of my problems were suddenly solved. Memory was freed properly after tables were deleted, and the script's memory use stayed stable. Further, processing time dropped by more than an order of magnitude, with each file now taking less than a second to process (this despite Python using only 25% of my CPU, whereas ArcMap consumed everything while I was running the script there).
I haven't run into this issue with any of the other scripts I've written.
So I didn't post this as a question, because I sort of solved it by not running the script within ArcMap. However, I thought I'd bring it up anyway to provoke some discussion.