
ArcGIS Pro Deep Learning v3.2 still not using multiple GPUs

11-28-2023 08:14 AM
JadedEarth
Frequent Contributor

I'm using ArcGIS Pro and the Deep Learning API v3.2.  My machine has multiple NVIDIA GPUs: 4 RTX A4000 and 2 RTX A6000.  It also has the Microsoft Basic Display Adapter and Microsoft Remote Display Adapter, 28 cores, and 1 TB of RAM, and it runs Microsoft Windows 10 Enterprise.

According to the documentation here, if I use the U-Net model type, ArcGIS will use all available GPUs.  I selected GPU as my processor type and left GPU ID blank, hoping it would use all available GPUs.

However, Task Manager shows the CPU doing most of the work.

CPU_trainModel.JPG

I also ran nvidia-smi to check utilization, and it likewise shows the GPUs sitting practically idle.

NvidiaSmi.JPG

The following shows my parameter and environment settings:

Parameters.jpg
Environment.jpg

What am I doing wrong?  I'd appreciate any help.

6 Replies
DanPatterson
MVP Esteemed Contributor

@PavanYadav, based on your responses to previous questions by @JadedEarth, could you comment on the assertion that multiple/all GPUs will be used if available?

Perhaps the documentation needs a more directed list of configuration cases, although I find the existing documents clear except on this GPU point.


... sort of retired...
PavanYadav
Esri Regular Contributor

Hi @DanPatterson 
Currently, the Train Deep Learning tool supports multiple GPUs for the following model types: 

  • ConnectNet
  • Feature classifier
  • MaskRCNN
  • Multi Task Road Extractor
  • Single Shot Detector
  • U-Net

We are currently working on supporting multiple GPUs for inferencing. 
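Before debugging the tool itself, it can help to confirm the training environment actually sees all six GPUs. A minimal diagnostic sketch, assuming only that nvidia-smi is on the PATH (it returns an empty list otherwise; this is not an Esri tool):

```python
# Hedged diagnostic sketch: list the GPUs that nvidia-smi reports, to
# confirm the environment running the training can see all of them.
import shutil
import subprocess

def visible_gpus():
    """Return GPU names reported by nvidia-smi, or [] if it is unavailable."""
    exe = shutil.which("nvidia-smi")
    if exe is None:
        return []
    result = subprocess.run(
        [exe, "--query-gpu=name", "--format=csv,noheader"],
        capture_output=True, text=True,
    )
    return [line.strip() for line in result.stdout.splitlines() if line.strip()]

print(visible_gpus())
```

If the list is shorter than expected, the tool cannot use the missing GPUs no matter which model type is selected.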

Thanks,

Pavan

Pavan Yadav
Product Engineer at Esri
AI for Imagery
Connect with me on LinkedIn!
Contact Esri Support Services
DanPatterson
MVP Esteemed Contributor

See the first post; the user is using U-Net for training.


... sort of retired...
JadedEarth
Frequent Contributor

The computer we're using is NOT a desktop, since we needed to install multiple GPUs.  Here's a photo of it below.

Arstxtem-gpu_computer.JPG

JadedEarth
Frequent Contributor

Now I'm told multi-GPU support only works from the command line.  Does anyone know where the instructions for multi-GPU deep learning from the command line are located?

PavanYadav
Esri Regular Contributor

@JadedEarth The Train Deep Learning tool in ArcGIS supports multiple GPUs for select model types.  We are hoping to have all inferencing support them in the 3.3 release.  And yes, the Python API has broader support for multi-GPU training.

https://developers.arcgis.com/python/guide/utilize-multiple-gpus-to-train-model/

https://pro.arcgis.com/en/pro-app/latest/tool-reference/image-analyst/train-deep-learning-model.htm 
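For the API route, a minimal training sketch is below, assuming the arcgis.learn package from the Esri deep-learning Python environment; the chip path, batch size, and saved-model name are placeholders, not recommendations. Per the multi-GPU guide above, supported models such as U-Net are expected to pick up all visible GPUs automatically when trained this way.

```python
# Hedged sketch of U-Net training through the arcgis.learn API.
# Assumes the Esri deep-learning Python environment; the chip path,
# batch size, and model name below are placeholders.

def train_unet(chips_path, epochs=10):
    """Prepare exported training chips and fit a U-Net classifier."""
    from arcgis.learn import prepare_data, UnetClassifier  # Esri deep-learning env
    data = prepare_data(chips_path, batch_size=16)
    model = UnetClassifier(data)
    model.fit(epochs)                 # supported models spread work across GPUs
    model.save("unet_multi_gpu_run")  # saved under chips_path/models
    return model

# Usage (placeholder path):
# train_unet(r"C:\data\training_chips")
```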

Pavan Yadav
Product Engineer at Esri
AI for Imagery
Connect with me on LinkedIn!
Contact Esri Support Services