
Adding a GPU to an Enterprise Deployment?

10-16-2023 08:54 AM
John_Davis709
New Contributor

Hello World,

I am relatively new to running an ArcGIS Enterprise deployment (Windows 11.0) and am working through selecting specifications for a migration to new hardware. Our current deployment cannot handle the demands of multiple heavy users and applications at the same time, which is most likely down to old server hardware.

For our next deployment we are trying to determine whether there is any benefit to adding a dedicated graphics card to the server machine. The main benefit we hope to achieve is faster rendering of federated map image layers and feature layers. The main offender is an Experience Builder app with 20+ feature layers that is slow to draw and causes slowdowns for all users.

Has anyone used a dedicated GPU in their deployment, and did you find any benefit? Does a dedicated GPU help with web map and application speed and usability?

I appreciate any help!

Thanks,

John

3 Replies
MarceloMarques
Esri Regular Contributor

Documentation Links to help with this discussion.

Does higher power Graphics Processing Unit (GPU) improve the performance of ArcGIS Image Server?

https://support.esri.com/en-us/knowledge-base/faq-does-higher-power-graphics-processing-unit-gpu-imp...

ArcGIS Server 11.1 system requirements
https://enterprise.arcgis.com/en/system-requirements/latest/windows/arcgis-server-system-requirement...

ArcGIS Pro 3.1 system requirements
https://pro.arcgis.com/en/pro-app/latest/get-started/arcgis-pro-system-requirements.htm

| Marcelo Marques | Esri Principal Product Engineer | Cloud & Database Administrator | OCP - Oracle Certified Professional | “In 1992, I embarked on my journey with Esri Technology, and since 1997, I have been working with ArcSDE Geodatabases, right from its initial release. Over the past 32 years, my passion for GIS has only grown stronger.” | “I do not fear computers. I fear the lack of them.” Isaac Asimov |
ChadKopplin
Frequent Contributor

John, thank you for your post.  Over the past couple of weeks I went through the same scenario with our GIS server.  We used to have a two-server, load-balanced solution, but a number of years ago we shut down one of the servers because we were not seeing enough usage of the system at the time.  Our problem was that we never added that server's cores or additional RAM to the remaining system.  Over the course of the past 3 years we have seen increased use across all 3 overarching areas of our system (mobile, web, and server/desktop), to the point that 98% of our RAM was being consumed at any moment.  We have since increased the cores and the RAM and have alleviated our speed bottleneck.

A graphics card would move the image functions to the card, but it does not help your geoprocessing, which is all driven by cores and RAM.  I am afraid you would still see some potential issues, especially if the majority of the 20 layers are large vector layers.  You can apply scale ranges to these layers so that they do not draw until the appropriate zoom level is reached; that way only a minimal number of layers are shown when zoomed to the full extent.

I hope this helps. I just wanted to share what we have gone through and what is working for us. Good luck.
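
As a rough illustration of the scale-range approach Chad describes, here is a minimal sketch using the ArcGIS Maps SDK for JavaScript. The service URL, field names, and scale values are placeholders and would need to be adapted to your own layers.

```typescript
import Map from "@arcgis/core/Map";
import MapView from "@arcgis/core/views/MapView";
import FeatureLayer from "@arcgis/core/layers/FeatureLayer";

// Placeholder URL for one of the federated feature layers in the app.
const parcels = new FeatureLayer({
  url: "https://example.com/server/rest/services/Parcels/FeatureServer/0",
  // Only draw this layer when zoomed in beyond 1:50,000, so it is not
  // requested or rendered at the full extent of the web map.
  minScale: 50000,
  maxScale: 0, // 0 = no upper limit
  // Request only the fields the popup and labels actually use.
  outFields: ["PARCEL_ID", "OWNER"],
});

const view = new MapView({
  container: "viewDiv", // id of the map <div> in the host page
  map: new Map({ basemap: "gray-vector", layers: [parcels] }),
});
```

In Experience Builder the same effect is normally achieved without code by setting each layer's visible range in the web map (Map Viewer or ArcGIS Pro) before the app consumes it.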

berniejconnors
Frequent Contributor

You should make sure your feature layers and map image layers are performing at their best before trying to solve the problem with hardware.  We were able to improve the performance of the map image layer for our parcel data with scale thresholds and labeling improvements.

Bernie.
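
The scale thresholds and labeling changes Bernie describes for a map image layer are normally configured in ArcGIS Pro before publishing. For client-side feature layers, a comparable label scale range can be sketched with the ArcGIS Maps SDK for JavaScript; the URL, field name, and scales below are placeholders.

```typescript
import FeatureLayer from "@arcgis/core/layers/FeatureLayer";
import LabelClass from "@arcgis/core/layers/support/LabelClass";
import TextSymbol from "@arcgis/core/symbols/TextSymbol";

// Placeholder parcel layer; swap in your own service URL and label field.
const parcels = new FeatureLayer({
  url: "https://example.com/server/rest/services/Parcels/FeatureServer/0",
  labelingInfo: [
    new LabelClass({
      labelExpressionInfo: { expression: "$feature.PARCEL_ID" },
      // Only draw labels when zoomed in beyond 1:10,000, so label placement
      // adds no work at smaller (zoomed-out) scales.
      minScale: 10000,
      maxScale: 0,
      symbol: new TextSymbol({
        color: "black",
        haloColor: "white",
        haloSize: 1,
      }),
    }),
  ],
});
```

Keeping this kind of tuning (scale ranges, simpler symbology, fewer labels) in the service and web map configuration is usually where the biggest gains are, regardless of the server hardware underneath.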