Although I have configured the Processor Type for GPU, and the sample export was performed with the GPU, the GPU is not being used when performing the training. Is this the expected behavior for this type of operation? See image.
Did you see the associated threads?
Solved: Workstation specifications, for deep learning - Esri Community
Hi @GeoprocessamentoCerradinho,
Could you tell us which specific GPU (make and model) is in use for the training process? (Device Manager > Display Adapters OR Task Manager > Performance > GPU)
1. In Task Manager, select the GPU that you expect to be used for the training process. You will see multiple panes such as 3D, Copy, Video Encode, etc. At the top-left of one of those panes there is a drop-down arrow; change its value to 'CUDA'.
2. In a command prompt, type 'nvidia-smi -l 3'. This will show detailed usage of the GPU during training (refreshed every 3 seconds).
While the training process is running, could you capture both and post a screenshot here?
Should look like this:
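If it helps, you can also confirm CUDA visibility directly from Python. This is a minimal sketch, assuming PyTorch is installed in your ArcGIS Pro Python environment (it ships with the Deep Learning Libraries installer):

import torch

# Reports whether PyTorch can see a CUDA-capable GPU at all
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    dev = torch.cuda.current_device()
    print("Device name:", torch.cuda.get_device_name(dev))
    # Memory held by tensors vs. reserved by PyTorch's caching allocator
    print("Allocated MB:", torch.cuda.memory_allocated(dev) / 1024**2)
    print("Reserved MB:", torch.cuda.memory_reserved(dev) / 1024**2)

If "CUDA available" prints False here, the training tool will silently fall back to the CPU regardless of the Processor Type setting.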
Thanks for the reply. I believe the training result is very wrong:
Hi @GeoprocessamentoCerradinho,
Based on your response on the other thread, I am guessing you were able to get the model working.
Were you able to replicate the above steps to ensure that GPU was indeed being used while training the deep learning model?
Yes, I noticed that although the CPU is being used to the maximum, the GPU sits at about 50% (the CUDA value in Task Manager). I don't know if this is the expected behavior, but I think so.
@GeoprocessamentoCerradinho Yup! That is the expected behavior. As long as CUDA is in use for training and nvidia-smi also shows the GPU in use, we are good to go. If you have resources left over, you may be able to bump up your batch size.
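If you do decide to raise the batch size, the place to set it is the batch_size parameter of prepare_data. This is a minimal sketch, assuming the arcgis.learn Python API; the chip folder path is hypothetical, and the model type and epoch count are placeholders for your own workflow:

from arcgis.learn import prepare_data, UnetClassifier

# Point this at the folder produced by Export Training Data For Deep Learning
data = prepare_data(
    r"C:\DeepLearning\exported_chips",  # hypothetical path
    batch_size=16,  # try doubling this while watching memory in nvidia-smi
)
model = UnetClassifier(data)  # placeholder model type
model.fit(10)  # placeholder: train for 10 epochs

If the GPU runs out of memory, nvidia-smi will show it pinned near its memory limit and training will fail with a CUDA out-of-memory error; in that case, drop the batch size back down.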