Deep Learning Not Using GPU

07-05-2021 08:10 PM
Stevohu
New Contributor II

G'day, I am very new to the world of ArcGIS and would like to set up a deep learning model to automatically detect objects within an image. I have created the necessary chipped images (using "Export Training Data For Deep Learning"), however when I run "Train Deep Learning Model" and select my GPU (RTX 2070), it just uses my CPU. I have followed the guide called "Deep Learning With ArcGIS Pro Tips and Tricks: Part 1", but when I get to the very last step of running the import torch and fastai commands in the Python window, no results are shown.
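For reference, the commands I mean are roughly the following (my paraphrase, not the guide's exact wording), run in the ArcGIS Pro Python window:

import torch
import fastai

# The imports succeed silently; printing the versions confirms
# the deep learning libraries are actually loaded.
print(torch.__version__)
print(fastai.__version__)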

If anyone could help me out that would be fantastic!!

Guide for setting up deep learning:

https://www.esri.com/arcgis-blog/products/arcgis-pro/imagery/deep-learning-with-arcgis-pro-tips-tric...

6 Replies
bhariadi
New Contributor III

Hi Stevohu,

Have you tested whether CUDA is available in your deep learning environment? Make sure the result that comes back is 'True'.
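A minimal check, assuming the active environment is the cloned ArcGIS Pro Python environment with the Deep Learning Libraries installed:

import torch

# True means PyTorch can see the GPU; False means training will fall back to the CPU.
print(torch.cuda.is_available())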

Stevohu
New Contributor II

When I go to test that, I do not get any results; no 'True' or 'False' comes up. I thought I had installed CUDA correctly.

bhariadi
New Contributor III

Hi Stevohu,

I followed the guided steps from the link you mentioned above, and the result was 'True'.

[screenshot: Capture13.JPG]

Another way is to use a Python notebook; the result is also 'True'.

[screenshot: Capture14.JPG]

Press 'Shift' + 'Enter' to run the code.
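If it helps, a slightly fuller version of the same check (a sketch; GPU index 0 is an assumption for a single-GPU machine) is:

import torch

print(torch.cuda.is_available())          # expect True
print(torch.version.cuda)                 # CUDA version PyTorch was built against
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))  # should report the RTX 2070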

 

Stevohu
New Contributor II

I used a Python notebook and got this result.

[screenshot: Stevohu_0-1625715841300.png]

So I ran "Train Deep Learning Model" with the following settings:

Model Type: Single Shot Detector

Zooms: 1.0

Ratios: [1.0, 1.0]

Chip Size: 256

Backbone Model: ResNet-101

Processor Type: GPU

GPU ID: 0

When it started running, it maxed out my CPU (i5 9600K @ 5.1 GHz) but it was not using my GPU at all.
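For reference, the same configuration in arcgis.learn would look roughly like this (just a sketch: the chips path is a placeholder and the parameter names may differ slightly between versions):

from arcgis.learn import prepare_data, SingleShotDetector

# Placeholder path to the folder produced by "Export Training Data For Deep Learning".
data = prepare_data(r"C:\path\to\exported_chips", chip_size=256, batch_size=8)

# Single Shot Detector with the same zooms/ratios and a ResNet-101 backbone.
ssd = SingleShotDetector(data, zooms=[1.0], ratios=[[1.0, 1.0]], backbone="resnet101")
ssd.fit(10)  # epoch count is arbitrary here

# While fit() is running, torch.cuda.memory_allocated() should be well above zero
# if the model is actually on the GPU.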

BrennaOlson
New Contributor II

I'm having the exact same problem. Did you ever get it figured out?

bhariadi
New Contributor III