Memory leak with MakeNetCDFRasterLayer

12-11-2016 11:43 PM
RobertGaugl
New Contributor

Hi!

I have to go through 30 years of Solar Incoming Surface Radiation data, and for each hour I have one NetCDF v3 (.nc) file, so I have 262,800 files to go through.

I made a little Python script (see attachment) which loops through all these files, makes a Raster Layer from each one with "MakeNetCDFRasterLayer_md", and then uses "ZonalStatisticsAsTable_sa" with "mean" as the statistics type and a shapefile of the NUTS statistical regions of Germany to get the mean solar radiation of each NUTS region.
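
For reference, here is a minimal sketch of the kind of loop described above (the attached script itself is not reproduced here). The NetCDF variable name "SIS", the dimension names "lon"/"lat", the zone field "NUTS_ID" and all paths are hypothetical placeholders, not values taken from the original script.

```python
import glob
import os

import arcpy

arcpy.CheckOutExtension("Spatial")  # ZonalStatisticsAsTable_sa needs Spatial Analyst

nc_folder = r"C:\data\solar_nc"          # hypothetical folder with the .nc files
zones_shp = r"C:\data\nuts_germany.shp"  # hypothetical NUTS regions shapefile
out_folder = r"C:\data\out_tables"       # hypothetical output folder

for nc_file in sorted(glob.glob(os.path.join(nc_folder, "*.nc"))):
    base = os.path.splitext(os.path.basename(nc_file))[0]
    out_table = os.path.join(out_folder, base + ".dbf")

    # Build a raster layer from the NetCDF file (variable/dimension names assumed)
    arcpy.MakeNetCDFRasterLayer_md(nc_file, "SIS", "lon", "lat", "rad_layer")

    # Mean solar radiation per NUTS region
    arcpy.ZonalStatisticsAsTable_sa(zones_shp, "NUTS_ID", "rad_layer",
                                    out_table, "DATA", "MEAN")

    # Drop the layer again; per the post, this does not release the RAM
    arcpy.Delete_management("rad_layer")
```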

My problem is that I can see in the Task Manager that the RAM usage keeps increasing with each loop, and when it reaches about 990,000 K ArcMap can't process any more NetCDF files. Then I have to restart ArcMap and restart my script. Because it reaches this state after about 680 files, I would have to restart ArcMap 387 times to get through all my files!

As far as I can tell from testing, the memory leaks in "MakeNetCDFRasterLayer_md", although I use "Delete_management" to delete the Raster Layer from memory.

I also tried writing the Raster Layer to disk, but the memory just keeps leaking (and it makes the whole process a lot slower).
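
(One way to write the layer to disk, reusing the hypothetical layer name from the sketch above, would be CopyRaster; per the post, this did not stop the leak.)

```python
# Persist the layer as a temporary GeoTIFF instead of keeping it only in memory;
# the output path is a hypothetical placeholder.
arcpy.CopyRaster_management("rad_layer", r"C:\data\tmp\rad_tmp.tif")
```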

I also tried ModelBuilder with the "Iterate Files" iterator and MakeNetCDFRasterLayer, but I have the same problem there as well.

I'm using ArcMap 10.4.1.

Maybe someone can help me.

Thanks,

Robert

3 Replies
DanPatterson_Retired
MVP Emeritus

Are the results of the process being added to ArcMap as you are going? Have you run the process outside of an ArcMap session with the same results?

RobertGaugl
New Contributor (accepted solution)

I made a workaround:

I have a main Python script which opens my second Python script as a subprocess. The second Python script works through 200 files and then exits (so I never hit a RAM limit or some other limit). The main Python script keeps re-launching the second Python script until no files are left.
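
A minimal sketch of that parent/child pattern, assuming a hypothetical worker script name ("process_batch.py") that reads a start index and batch size from the command line and runs the processing loop on that slice of the file list:

```python
import glob
import subprocess
import sys

BATCH_SIZE = 200
nc_files = sorted(glob.glob(r"C:\data\solar_nc\*.nc"))  # hypothetical folder

for start in range(0, len(nc_files), BATCH_SIZE):
    # Each batch runs in a fresh Python process, so any memory leaked by
    # arcpy is given back to the OS when the child process exits.
    subprocess.check_call([sys.executable, "process_batch.py",
                           str(start), str(BATCH_SIZE)])
```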

DanPatterson_Retired
MVP Emeritus

Good. That probably means there are *.lock files being created and a del statement is of no use. Next time you run your script, see if that is the case by monitoring your folder... unless of course you are working in a geodatabase, where this won't be possible and Python has little or no control over what goes on there.
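
If anyone wants to check for that, a quick way to watch a folder while the main script runs might look like the snippet below; the folder path and the "*lock*" pattern are assumptions (lock file naming differs between file-based workspaces and geodatabases).

```python
import glob
import os
import time

watch_folder = r"C:\data\out_tables"  # hypothetical folder to watch

# Print any lock-like files once a minute while the main script is running
while True:
    locks = glob.glob(os.path.join(watch_folder, "*lock*"))
    print("{0} lock-like files: {1}".format(len(locks), locks))
    time.sleep(60)
```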
