I have to go through 30 years of Surface Incoming Solar Radiation data, with one NetCDF v3 (.nc) file per hour, so I have 262,800 files to process.
I wrote a small Python script (see attachment) that loops over all these files, makes a raster layer from each one with "MakeNetCDFRasterLayer_md", and then runs "ZonalStatisticsAsTable_sa" with "MEAN" as the statistics type against a shapefile of the NUTS statistical regions of Germany, to get the mean solar radiation for each NUTS region.
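The attachment itself isn't shown here, so this is only a hedged reconstruction of the loop I described. The folder layout, the NetCDF variable name ("SIS"), the dimension names ("lon"/"lat"), and the zone field ("NUTS_ID") are assumptions, not taken from the actual script:

```python
# Sketch of the per-file loop (variable, dimension, and field names assumed).
import glob
import os

def list_nc_files(folder):
    """Return the hourly NetCDF files in name (i.e. chronological) order."""
    return sorted(glob.glob(os.path.join(folder, "*.nc")))

def out_table_name(nc_path):
    """Derive a per-file output table name from the NetCDF file name."""
    base = os.path.splitext(os.path.basename(nc_path))[0]
    return "zonal_" + base

def process_file(nc_path, zones_shp, workspace):
    """One iteration: make a raster layer, compute zonal means, delete the layer.

    arcpy is imported lazily so the module can be read outside ArcGIS;
    inside ArcMap 10.4.1 a plain `import arcpy` at the top works the same.
    """
    import arcpy
    layer = "rad_layer"
    arcpy.MakeNetCDFRasterLayer_md(nc_path, "SIS", "lon", "lat", layer)
    arcpy.ZonalStatisticsAsTable_sa(
        zones_shp, "NUTS_ID", layer,
        os.path.join(workspace, out_table_name(nc_path)),
        "DATA", "MEAN")
    arcpy.Delete_management(layer)  # does not stop the RAM growth
```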
My problem is that I can see in the Task Manager that RAM usage keeps growing with each loop iteration, and once it reaches about 990,000 K ArcMap can't process any more NetCDF files. Then I have to restart ArcMap and restart my script. Since it reaches this state after about 680 files, I would have to restart ArcMap 387 times to get through all my files!
As far as I can tell from testing, the memory leaks inside "MakeNetCDFRasterLayer_md", although I use "Delete_management" to remove the raster layer from memory.
I also tried writing the raster layer to disk, but the memory still keeps leaking (and the whole process gets a lot slower).
I also tried ModelBuilder with the "Iterate Files" iterator and MakeNetCDFRasterLayer, but I run into the same problem there as well.
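For context, the manual restarts could in principle be automated: since the leaked memory is reclaimed when a process exits, the files can be processed in batches, each batch in a fresh Python child process. This is only a sketch of that idea, not something from my script; the worker script name and batch size are assumptions:

```python
# Sketch: run each batch of files in a fresh python.exe so the OS
# reclaims whatever MakeNetCDFRasterLayer_md leaked during that batch.
import subprocess
import sys

def make_batches(files, batch_size):
    """Pure helper: split a file list into chunks of at most batch_size."""
    return [files[i:i + batch_size] for i in range(0, len(files), batch_size)]

def run_in_batches(files, batch_size, worker_script):
    """Launch a child Python per batch; memory is freed when it exits.

    worker_script is a hypothetical script that processes the .nc paths
    it receives on the command line, using the loop described above.
    """
    for batch in make_batches(files, batch_size):
        subprocess.check_call([sys.executable, worker_script] + batch)
```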
I'm using ArcMap 10.4.1.
Maybe someone can help me.