<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Memory leak makeNetCDFRasterLayer in Python Questions</title>
    <link>https://community.esri.com/t5/python-questions/memory-leak-makenetcdfrasterlayer/m-p/318329#M24723</link>
    <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;Hi!&lt;/P&gt;&lt;P&gt;I have to go through 30 years of Solar Incoming Surface Radiation data, with one NetCDF v3&amp;nbsp;(.nc) file per hour, so I have 262,800 files to process.&lt;/P&gt;&lt;P&gt;I made a small Python script (see attachment)&amp;nbsp;that loops through all these files, makes a Raster Layer from each file with "MakeNetCDFRasterLayer_md", and then uses "ZonalStatisticsAsTable_sa" with "mean" as the statistics type and a shapefile of the NUTS statistical regions of Germany to get the mean solar radiation of each NUTS region.&lt;/P&gt;&lt;P&gt;My problem is that I can see in the Task Manager that RAM usage keeps increasing with each loop, and when it reaches about 990,000 K ArcMap can't process any more NetCDF files. Then I have to restart ArcMap and restart my script. Because it reaches this state after about 680 files, I would have to restart ArcMap 387 times to get through all my files!&lt;/P&gt;&lt;P&gt;As far as I can tell, the memory leaks in "MakeNetCDFRasterLayer_md", although I use "Delete_management" to delete the Raster Layer from memory.&lt;/P&gt;&lt;P&gt;I also tried writing the Raster Layer to disk, but the memory still leaks (and the whole process becomes much slower).&lt;/P&gt;&lt;P&gt;I also tried ModelBuilder with the "Iterate Files" function and MakeNetCDFRasterLayer, but I have the same problem there as well.&lt;/P&gt;&lt;P&gt;I'm using ArcMap 10.4.1.&lt;/P&gt;&lt;P&gt;Maybe someone can help me.&lt;/P&gt;&lt;P&gt;Thanks,&lt;/P&gt;&lt;P&gt;Robert&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
    <pubDate>Mon, 12 Dec 2016 07:43:57 GMT</pubDate>
    <dc:creator>RobertGaugl</dc:creator>
    <dc:date>2016-12-12T07:43:57Z</dc:date>
    <item>
      <title>Memory leak makeNetCDFRasterLayer</title>
      <link>https://community.esri.com/t5/python-questions/memory-leak-makenetcdfrasterlayer/m-p/318329#M24723</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;Hi!&lt;/P&gt;&lt;P&gt;I have to go through 30 years of Solar Incoming Surface Radiation data, with one NetCDF v3&amp;nbsp;(.nc) file per hour, so I have 262,800 files to process.&lt;/P&gt;&lt;P&gt;I made a small Python script (see attachment)&amp;nbsp;that loops through all these files, makes a Raster Layer from each file with "MakeNetCDFRasterLayer_md", and then uses "ZonalStatisticsAsTable_sa" with "mean" as the statistics type and a shapefile of the NUTS statistical regions of Germany to get the mean solar radiation of each NUTS region.&lt;/P&gt;&lt;P&gt;My problem is that I can see in the Task Manager that RAM usage keeps increasing with each loop, and when it reaches about 990,000 K ArcMap can't process any more NetCDF files. Then I have to restart ArcMap and restart my script. Because it reaches this state after about 680 files, I would have to restart ArcMap 387 times to get through all my files!&lt;/P&gt;&lt;P&gt;As far as I can tell, the memory leaks in "MakeNetCDFRasterLayer_md", although I use "Delete_management" to delete the Raster Layer from memory.&lt;/P&gt;&lt;P&gt;I also tried writing the Raster Layer to disk, but the memory still leaks (and the whole process becomes much slower).&lt;/P&gt;&lt;P&gt;I also tried ModelBuilder with the "Iterate Files" function and MakeNetCDFRasterLayer, but I have the same problem there as well.&lt;/P&gt;&lt;P&gt;I'm using ArcMap 10.4.1.&lt;/P&gt;&lt;P&gt;Maybe someone can help me.&lt;/P&gt;&lt;P&gt;Thanks,&lt;/P&gt;&lt;P&gt;Robert&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Mon, 12 Dec 2016 07:43:57 GMT</pubDate>
      <guid>https://community.esri.com/t5/python-questions/memory-leak-makenetcdfrasterlayer/m-p/318329#M24723</guid>
      <dc:creator>RobertGaugl</dc:creator>
      <dc:date>2016-12-12T07:43:57Z</dc:date>
    </item>
    <item>
      <title>Re: Memory leak makeNetCDFRasterLayer</title>
      <link>https://community.esri.com/t5/python-questions/memory-leak-makenetcdfrasterlayer/m-p/318330#M24724</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;Are the results of the process being added to ArcMap as you go? Have you run the process outside of an ArcMap session with the same results?&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Tue, 13 Dec 2016 03:10:20 GMT</pubDate>
      <guid>https://community.esri.com/t5/python-questions/memory-leak-makenetcdfrasterlayer/m-p/318330#M24724</guid>
      <dc:creator>DanPatterson_Retired</dc:creator>
      <dc:date>2016-12-13T03:10:20Z</dc:date>
    </item>
    <item>
      <title>Re: Memory leak makeNetCDFRasterLayer</title>
      <link>https://community.esri.com/t5/python-questions/memory-leak-makenetcdfrasterlayer/m-p/318331#M24725</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;I made a workaround:&lt;/P&gt;&lt;P&gt;I have a main Python script that opens my second Python script as a subprocess. The second script goes through 200 files and then exits (so it never hits the RAM limit or any other limit). The main script relaunches the second script until no files are left.&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Tue, 13 Dec 2016 12:44:43 GMT</pubDate>
      <guid>https://community.esri.com/t5/python-questions/memory-leak-makenetcdfrasterlayer/m-p/318331#M24725</guid>
      <dc:creator>RobertGaugl</dc:creator>
      <dc:date>2016-12-13T12:44:43Z</dc:date>
    </item>
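    <!--
    The batching workaround described above can be sketched as follows; the worker script name "process_batch.py", the index arguments, and the driver structure are assumptions for illustration, not the poster's actual code. The key point is that each batch runs in a fresh Python process, so whatever memory arcpy leaks is reclaimed by the OS when the worker exits.

    ```python
    import subprocess
    import sys

    BATCH_SIZE = 200  # the poster restarts the worker after 200 files

    def batches(n_files, batch_size=BATCH_SIZE):
        """Yield (start, stop) index pairs covering n_files in batches."""
        for start in range(0, n_files, batch_size):
            yield start, min(start + batch_size, n_files)

    def main(n_files):
        # Launch a fresh Python process per batch; "process_batch.py" is a
        # hypothetical worker that runs MakeNetCDFRasterLayer_md and
        # ZonalStatisticsAsTable_sa on files[start:stop].
        for start, stop in batches(n_files):
            subprocess.check_call(
                [sys.executable, "process_batch.py", str(start), str(stop)]
            )

    # e.g. main(262800) would launch 1,314 worker runs of 200 files each
    ```
    -->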
    <item>
      <title>Re: Memory leak makeNetCDFRasterLayer</title>
      <link>https://community.esri.com/t5/python-questions/memory-leak-makenetcdfrasterlayer/m-p/318332#M24726</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;Good. That probably means there are *.lock files being created and a del statement is of no use. Next time you run your script, see if that is the case by monitoring your folder... unless of course you are working in a geodatabase, where this won't be possible and Python has little or no control over what goes on there.&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Tue, 13 Dec 2016 12:49:44 GMT</pubDate>
      <guid>https://community.esri.com/t5/python-questions/memory-leak-makenetcdfrasterlayer/m-p/318332#M24726</guid>
      <dc:creator>DanPatterson_Retired</dc:creator>
      <dc:date>2016-12-13T12:49:44Z</dc:date>
    </item>
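    <!--
    A minimal way to monitor the folder for lock files as suggested above; the "*.lock" glob pattern is an assumption about the extension in use, so adjust it to whatever files actually appear in the workspace.

    ```python
    import glob
    import os

    def list_lock_files(folder, pattern="*.lock"):
        """Return the lock files currently present in folder, sorted by name."""
        return sorted(glob.glob(os.path.join(folder, pattern)))
    ```

    Calling this between loop iterations (or from a second interpreter) shows whether lock files are accumulating instead of being released.
    -->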
  </channel>
</rss>

