<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: How much javaHeapSize is too much? in ArcGIS Enterprise Questions</title>
    <link>https://community.esri.com/t5/arcgis-enterprise-questions/how-much-javaheapsize-is-too-much/m-p/1604774#M41997</link>
    <description>&lt;P&gt;C'mon who has the biggest javaHeapSize setting??&amp;nbsp;&lt;span class="lia-unicode-emoji" title=":face_with_tongue:"&gt;😛&lt;/span&gt;&lt;/P&gt;</description>
    <pubDate>Thu, 10 Apr 2025 15:12:45 GMT</pubDate>
    <dc:creator>TimHaverlandNOAA</dc:creator>
    <dc:date>2025-04-10T15:12:45Z</dc:date>
    <item>
      <title>How much javaHeapSize is too much?</title>
      <link>https://community.esri.com/t5/arcgis-enterprise-questions/how-much-javaheapsize-is-too-much/m-p/1600066#M41909</link>
      <description>&lt;P&gt;Hi, I have a MapServer service whose ArcSOC process crashes because it runs out of Java heap space. This happens when a /query request asks for all features, geometry, and attributes with a return format of pjson. The service uses the default max record count of 2000.&lt;/P&gt;&lt;P&gt;I have found that if I set the javaHeapSize for the service to 312 (MB), the query is processed and data begins to download.&lt;/P&gt;&lt;P&gt;If orderByFields are also specified in the query, I have to increase the javaHeapSize to 1024 (MB).&lt;/P&gt;&lt;P&gt;I only have a maximum of two instances for this service, so increasing the javaHeapSize to 1024 still keeps memory utilization for the machine in a healthy zone. It doesn't appear that the entire 1024 MB is used right away; the ArcSOC just has more "headspace" to execute larger queries.&lt;/P&gt;&lt;P&gt;I'm wondering if others have had to increase the javaHeapSize beyond the 64 MB default. Who can report the biggest, baddest javaHeapSize of them all??&lt;/P&gt;&lt;P&gt;Tim&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Thu, 27 Mar 2025 18:56:11 GMT</pubDate>
      <guid>https://community.esri.com/t5/arcgis-enterprise-questions/how-much-javaheapsize-is-too-much/m-p/1600066#M41909</guid>
      <dc:creator>TimHaverlandNOAA</dc:creator>
      <dc:date>2025-03-27T18:56:11Z</dc:date>
    </item>
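The per-service javaHeapSize discussed above is edited through the ArcGIS Server Administrator API, where it appears as a string-valued service property. A minimal sketch, assuming a hypothetical service JSON shape (the real edit operation expects the full service definition POSTed back, which is left to the reader):

```python
import json

def with_java_heap_size(service_json, heap_mb):
    """Return a deep copy of an Admin API service JSON with its
    javaHeapSize property set to heap_mb (stored as a string)."""
    updated = json.loads(json.dumps(service_json))  # cheap deep copy
    updated.setdefault("properties", {})["javaHeapSize"] = str(heap_mb)
    return updated

# Example: bump a service from the 64 MB default to the 312 MB
# that let the /query request in the post complete.
svc = {"serviceName": "Example", "properties": {"javaHeapSize": "64"}}
print(with_java_heap_size(svc, 312)["properties"]["javaHeapSize"])  # prints 312
```

The property name javaHeapSize matches the thread; the surrounding JSON shape and service name here are illustrative only.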
    <item>
      <title>Re: How much javaHeapSize is too much?</title>
      <link>https://community.esri.com/t5/arcgis-enterprise-questions/how-much-javaheapsize-is-too-much/m-p/1604774#M41997</link>
      <description>&lt;P&gt;C'mon who has the biggest javaHeapSize setting??&amp;nbsp;&lt;span class="lia-unicode-emoji" title=":face_with_tongue:"&gt;😛&lt;/span&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 10 Apr 2025 15:12:45 GMT</pubDate>
      <guid>https://community.esri.com/t5/arcgis-enterprise-questions/how-much-javaheapsize-is-too-much/m-p/1604774#M41997</guid>
      <dc:creator>TimHaverlandNOAA</dc:creator>
      <dc:date>2025-04-10T15:12:45Z</dc:date>
    </item>
    <item>
      <title>Re: How much javaHeapSize is too much?</title>
      <link>https://community.esri.com/t5/arcgis-enterprise-questions/how-much-javaheapsize-is-too-much/m-p/1604797#M42000</link>
      <description>&lt;P&gt;I have managed dozens of ArcGIS Server deployments, stand-alone and federated.&amp;nbsp; Generally, the default works for the vast majority of services, but there are a few where the value has to be increased.&amp;nbsp; Typically the service crashes because overly dense geometries are being serialized by someone scraping the API to download the data.&lt;/P&gt;&lt;P&gt;Typically we try doubling it once (128 MB) to see if that resolves the matter.&amp;nbsp; If not, we double it again (256 MB), and if that still doesn't work we start alternating between halving the max record count and doubling the javaHeapSize.&amp;nbsp; So, if 256 MB doesn't work, we cut the max record count to 1000 (the old default for over a decade).&amp;nbsp; If that doesn't work, we try increasing the heap again to 512 MB, and finally cutting the max record count to 500.&lt;/P&gt;&lt;P&gt;My general belief is that a service needing a javaHeapSize greater than 512 MB probably needs to have its data restructured on the back end.&amp;nbsp; I can't think of any instance where I have allowed a javaHeapSize greater than 512 MB.&lt;/P&gt;</description>
      <pubDate>Thu, 10 Apr 2025 15:39:03 GMT</pubDate>
      <guid>https://community.esri.com/t5/arcgis-enterprise-questions/how-much-javaheapsize-is-too-much/m-p/1604797#M42000</guid>
      <dc:creator>JoshuaBixby</dc:creator>
      <dc:date>2025-04-10T15:39:03Z</dc:date>
    </item>
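The doubling-and-halving ladder in the reply above can be sketched as a trial sequence (the function name and return shape are made up for illustration; the 64 MB starting heap, 2000-record default, and 512 MB ceiling come from the posts):

```python
def tuning_trials(heap_mb=64, max_records=2000):
    """Return (javaHeapSize_mb, maxRecordCount) pairs in the order the
    reply suggests trying them: double the heap twice, then alternate
    halving the record count and doubling the heap, stopping at 512 MB."""
    trials = []
    heap_mb *= 2          # 128 MB
    trials.append((heap_mb, max_records))
    heap_mb *= 2          # 256 MB
    trials.append((heap_mb, max_records))
    max_records //= 2     # 1000 records
    trials.append((heap_mb, max_records))
    heap_mb *= 2          # 512 MB, the suggested ceiling
    trials.append((heap_mb, max_records))
    max_records //= 2     # 500 records
    trials.append((heap_mb, max_records))
    return trials

print(tuning_trials())
# [(128, 2000), (256, 2000), (256, 1000), (512, 1000), (512, 500)]
```

Anything still crashing after the last trial would, per the reply, point to restructuring the data rather than raising the heap further.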
    <item>
      <title>Re: How much javaHeapSize is too much?</title>
      <link>https://community.esri.com/t5/arcgis-enterprise-questions/how-much-javaheapsize-is-too-much/m-p/1604804#M42001</link>
      <description>&lt;P&gt;Hi Joshua,&lt;/P&gt;&lt;P&gt;You're at the top of the leaderboard with javaHeapSize=512MB. Congrats!&lt;/P&gt;&lt;P&gt;I'm going to use your method to try to keep the javaHeapSize value at or below 512 MB.&lt;/P&gt;&lt;P&gt;I also think we may have one or two extremely complex polygons that need simplification.&lt;/P&gt;&lt;P&gt;Thanks for sharing your deep experience.&lt;/P&gt;</description>
      <pubDate>Thu, 10 Apr 2025 15:51:31 GMT</pubDate>
      <guid>https://community.esri.com/t5/arcgis-enterprise-questions/how-much-javaheapsize-is-too-much/m-p/1604804#M42001</guid>
      <dc:creator>TimHaverlandNOAA</dc:creator>
      <dc:date>2025-04-10T15:51:31Z</dc:date>
    </item>
    <item>
      <title>Re: How much javaHeapSize is too much?</title>
      <link>https://community.esri.com/t5/arcgis-enterprise-questions/how-much-javaheapsize-is-too-much/m-p/1650715#M43056</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.esri.com/t5/user/viewprofilepage/user-id/427597"&gt;@TimHaverlandNOAA&lt;/a&gt;&amp;nbsp;&lt;BR /&gt;I once had a custom printing service that had to be able to print huge PDFs (A0-size maps with many layers). Whenever I implemented it at a customer's installation, I had to set the javaHeapSize to 1024 MB for it to work properly.&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Tue, 16 Sep 2025 15:58:44 GMT</pubDate>
      <guid>https://community.esri.com/t5/arcgis-enterprise-questions/how-much-javaheapsize-is-too-much/m-p/1650715#M43056</guid>
      <dc:creator>GIS-Chris</dc:creator>
      <dc:date>2025-09-16T15:58:44Z</dc:date>
    </item>
  </channel>
</rss>

