<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Can GPU improve performance of Image Server? in ArcGIS Enterprise Questions</title>
    <link>https://community.esri.com/t5/arcgis-enterprise-questions/can-gpu-improve-performance-of-image-server/m-p/1020528#M29340</link>
    <description>&lt;P&gt;Hey there Min,&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp;Thank you for your question. I researched this for a bit with my colleagues and found that Image Server does in fact benefit from GPUs, but not for processing and rendering mosaic datasets. At 10.8.1, it seems that deep learning workflows take advantage of GPUs; see here: &lt;A href="https://enterprise.arcgis.com/en/image/latest/raster-analytics/configure-and-deploy-arcgis-enterprise-for-deep-learning-raster-analytics.htm#ESRI_SECTION1_C30D73392D964D51A8B606128A8A6E8F" target="_self"&gt;Configure ArcGIS Image Server for deep learning raster analytics&lt;/A&gt;&lt;/P&gt;&lt;P data-unlink="true"&gt;"&lt;SPAN&gt;Support is available for input raster from data store, and raster analysis service for single node multiple GPU parallel processing for Deep Learning."&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp;I am continuing my research into this, and will reply with an update if I find anything further.&lt;/P&gt;</description>
    <pubDate>Wed, 27 Jan 2021 16:02:21 GMT</pubDate>
    <dc:creator>JonEmch</dc:creator>
    <dc:date>2021-01-27T16:02:21Z</dc:date>
    <item>
      <title>Can GPU improve performance of Image Server?</title>
      <link>https://community.esri.com/t5/arcgis-enterprise-questions/can-gpu-improve-performance-of-image-server/m-p/1019525#M29316</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I am wondering if having one or more GPUs can improve the overall performance of Image Server.&lt;/P&gt;&lt;P&gt;Can standard image services benefit from a GPU? For example, processing and rendering a mosaic dataset.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks in advance for your answer.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Cheers,&lt;/P&gt;&lt;P&gt;Min&lt;/P&gt;</description>
      <pubDate>Mon, 25 Jan 2021 03:59:41 GMT</pubDate>
      <guid>https://community.esri.com/t5/arcgis-enterprise-questions/can-gpu-improve-performance-of-image-server/m-p/1019525#M29316</guid>
      <dc:creator>GBSKorea</dc:creator>
      <dc:date>2021-01-25T03:59:41Z</dc:date>
    </item>
    <item>
      <title>Re: Can GPU improve performance of Image Server?</title>
      <link>https://community.esri.com/t5/arcgis-enterprise-questions/can-gpu-improve-performance-of-image-server/m-p/1020528#M29340</link>
      <description>&lt;P&gt;Hey there Min,&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp;Thank you for your question. I researched this for a bit with my colleagues and found that Image Server does in fact benefit from GPUs, but not for processing and rendering mosaic datasets. At 10.8.1, it seems that deep learning workflows take advantage of GPUs; see here: &lt;A href="https://enterprise.arcgis.com/en/image/latest/raster-analytics/configure-and-deploy-arcgis-enterprise-for-deep-learning-raster-analytics.htm#ESRI_SECTION1_C30D73392D964D51A8B606128A8A6E8F" target="_self"&gt;Configure ArcGIS Image Server for deep learning raster analytics&lt;/A&gt;&lt;/P&gt;&lt;P data-unlink="true"&gt;"&lt;SPAN&gt;Support is available for input raster from data store, and raster analysis service for single node multiple GPU parallel processing for Deep Learning."&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp;I am continuing my research into this, and will reply with an update if I find anything further.&lt;/P&gt;</description>
      <pubDate>Wed, 27 Jan 2021 16:02:21 GMT</pubDate>
      <guid>https://community.esri.com/t5/arcgis-enterprise-questions/can-gpu-improve-performance-of-image-server/m-p/1020528#M29340</guid>
      <dc:creator>JonEmch</dc:creator>
      <dc:date>2021-01-27T16:02:21Z</dc:date>
    </item>
    <item>
      <title>Re: Can GPU improve performance of Image Server?</title>
      <link>https://community.esri.com/t5/arcgis-enterprise-questions/can-gpu-improve-performance-of-image-server/m-p/1020909#M29348</link>
      <description>&lt;P&gt;Hi Jon,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks for your reply. As my client is not going to do any deep learning work, there is probably no gain from having a GPU.&lt;/P&gt;&lt;P&gt;Having said that, your research sounds interesting and will benefit others with the same needs.&lt;/P&gt;&lt;P&gt;Keep us updated!&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Regards,&lt;/P&gt;&lt;P&gt;Min&lt;/P&gt;</description>
      <pubDate>Thu, 28 Jan 2021 08:22:00 GMT</pubDate>
      <guid>https://community.esri.com/t5/arcgis-enterprise-questions/can-gpu-improve-performance-of-image-server/m-p/1020909#M29348</guid>
      <dc:creator>GBSKorea</dc:creator>
      <dc:date>2021-01-28T08:22:00Z</dc:date>
    </item>
  </channel>
</rss>

