I'm writing a daily soil water balance model with arcpy. Precipitation, ET, vegetation, and soil thickness inputs feed into classes that calculate the model parameters. The model runs great for short periods, but not for a whole year: it typically makes it to about day 200 and then gives the dreaded "ERROR 010240: Could not save raster dataset to &lt;value&gt; with output format FGDBR." When I open ArcCatalog I find a mess of 500 or so rasters in my output file geodatabase (it's clean if the run completes). I've tried setting arcpy.env.overwriteOutput = True, using scratchWorkspace, and deleting my scratch GDB every day (doesn't work; it's locked). Now I'm saving every intermediate raster and deleting it the next day, which runs very slowly.
The problem started when I coded in a bunch of methods to find the cell value at a point, so I could build a list of point values through the year to use as a time series on a plot.
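The sampling logic is roughly the following. All names here are hypothetical, and the cell lookup is stubbed with plain lists; in the real model it would be something like arcpy.management.GetCellValue(raster, "x y").getOutput(0) on each day's raster:

```python
# Rough sketch of the per-day point sampling added for the time-series plot.
# Function and variable names are hypothetical; with arcpy, cell_value_at_point
# would wrap arcpy.management.GetCellValue on the day's raster.

def cell_value_at_point(raster_values, row, col):
    # Stand-in for reading a single cell from a raster.
    return raster_values[row][col]


def collect_series(daily_rasters, row, col):
    """Return [(day, value), ...] for one cell across all daily rasters."""
    series = []
    for day, raster in enumerate(daily_rasters, start=1):
        series.append((day, cell_value_at_point(raster, row, col)))
    return series


# Three fake 2x2 daily grids standing in for daily model outputs.
days = [[[0.1, 0.2], [0.3, 0.4]],
        [[0.5, 0.6], [0.7, 0.8]],
        [[0.9, 1.0], [1.1, 1.2]]]
series = collect_series(days, 0, 1)
```

The point is that sampling touches every day's raster after it is written, which is roughly when the intermediate-raster buildup started for me.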
I understand there is a 1 TB size limit on a file geodatabase, but I don't even have a terabyte of disk on my machine. Is this related to RAM (I have 16 GB), the hard drive, or some other limit I'm not aware of?
Is there some better way to delete these temporary rasters?