GPU not working

02-14-2023 02:38 AM
AdrienMichez
New Contributor II

Hi,

I'm trying to use GPU capabilities to train a deep learning model in ArcGIS Pro with the Train Deep Learning Model tool. I used the MSI installer to install the deep learning libraries. It's working, but it is painfully slow!

From other posts, I checked that CUDA is properly installed and that the Python command torch.cuda.is_available() returns True.

I've set the Processor Type environment to GPU, but looking at Task Manager, neither my GPU nor my CPUs seem to be working hard.

[Screenshot: Task Manager showing low GPU and CPU utilization]

I also checked with the nvidia-smi monitoring tool, which confirms that no running processes are found:

[Screenshot: nvidia-smi output showing no running processes]

I'm using a GeForce GTX 770. Maybe this model is not compatible? It's not a recent NVIDIA card.
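(For anyone checking the same thing: a minimal sketch of how to query the card from the ArcGIS Pro Python prompt using standard PyTorch calls. Whether a given compute capability is still supported depends on the PyTorch build that the deep learning installer ships, so treat this as a quick sanity check, not a definitive answer.)

>>> import torch
>>> torch.cuda.get_device_name(0)        # which GPU PyTorch sees
>>> torch.cuda.get_device_capability(0)  # (major, minor) compute capability; the GTX 770 is a Kepler card, (3, 0)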

Best regards

Adrien
6 Replies
DanPatterson
MVP Esteemed Contributor

Have you checked the FAQ?

Deep learning frequently asked questions—ArcGIS Pro | Documentation


... sort of retired...
AdrienMichez
New Contributor II

Yes, I did.

I think this is probably due to my old GPU, but all the tests/checks I found online came out positive, so I don't have confirmation that this is actually the issue.

PavanYadav
Esri Contributor

@DanPatterson thank you for sharing the FAQ link.

I'm copying/pasting the relevant info from the link above in case other users refer to this thread:

The recommended VRAM for running training and inferencing deep learning tools in ArcGIS Pro is 8 GB. If you are only performing inferencing (detection or classification with a pretrained model), 4 GB is the minimum required VRAM, but 8 GB is recommended.

If you do not have the required 4–8 GB of VRAM, you can run the tools on the CPU, though the processing time will be longer.

@AdrienMichez it looks like the GeForce GTX 770 has only 2 GB of memory (https://www.techpowerup.com/gpu-specs/geforce-gtx-770.c1856).
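If it helps, here is a minimal sketch (standard PyTorch calls, nothing ArcGIS-specific) to confirm how much VRAM PyTorch actually sees on the card from the ArcGIS Pro Python environment:

>>> import torch
>>> props = torch.cuda.get_device_properties(0)  # device 0 = the default GPU
>>> props.name                                   # card name as PyTorch reports it
>>> props.total_memory / 1024**3                 # total VRAM in GB

If that last value is roughly 2, the card falls well short of the 8 GB recommended for training.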

PavanYadav
Esri Contributor

@AdrienMichez I am curious whether you were setting the Processor Type environment to GPU:

[Screenshot: Environments tab with Processor Type set to GPU]

You may also want to run these checks in the Python window:

>>> import torch
>>> torch.cuda.is_available()
True
>>> torch.cuda.device_count()
1
>>> torch.cuda.current_device()
0
>>> torch.cuda.device(0)
<torch.cuda.device at 0x7efce0b03be0>
>>> torch.cuda.get_device_name(0)
'GeForce GTX 950M'

Even with a 2 GB GPU you might not see much performance improvement, but I wanted to share the above anyway; you should still be able to use it.
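And in case you run the tool from the Python window rather than the tool dialog, the Processor Type environment can also be set in code. A minimal sketch, assuming the Image Analyst extension is licensed; the paths are hypothetical placeholders, and the keyword names should be double-checked against the Train Deep Learning Model documentation for your version of Pro:

import arcpy
from arcpy.ia import TrainDeepLearningModel

arcpy.CheckOutExtension("ImageAnalyst")

# Processor Type and GPU ID environments (same settings as on the Environments tab)
arcpy.env.processorType = "GPU"
arcpy.env.gpuId = 0

# Hypothetical input/output folders, for illustration only
TrainDeepLearningModel(r"C:\data\training_chips", r"C:\models\my_model",
                       max_epochs=20, model_type="UNET", batch_size=4)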

AdrienMichez
New Contributor II

Thanks!

@PavanYadav: yes, I was setting the environment just like that.

I chose to buy a new GPU, an NVIDIA GeForce RTX 4090.

That helped a lot 🙂

But I'm facing other problems... if you want to have a look, here is the thread:

https://community.esri.com/t5/arcgis-image-analyst-questions/rare-habitat-mapping-with-deeplearning-...

PavanYadav
Esri Contributor

@AdrienMichez thanks for sharing. For the other issue, I discussed it with my coworkers, and I see one of them has already responded to you in the thread. I hope what she shared helps resolve the issue.
