ArcGIS.Learn issues with new GPU

01-12-2021 06:39 PM
DavidGaworski1
New Contributor

After recently upgrading my GPU from a 980 Ti to a 3070, I have not been able to get arcgis.learn / fastai to execute properly anymore. None of my system resources are maxed out, and it runs fine on my other machine with a weaker i5 / 1060 6GB.

Has anyone had issues using these libraries on newer RTX cards? I'm working on a very small dataset, and it's taking over 30 minutes just to load a show_batch(), even after a fresh install of Windows. Once it does load, the images are very blurry and do not render properly. I can't even get started on lr_find() or fit().
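For context, the workflow I'm running is roughly the following (the data path and model class here are placeholders rather than my exact code; the hang happens at show_batch(), before any model is even created):

from arcgis.learn import prepare_data, SingleShotDetector

# Placeholder path to the exported training chips.
data = prepare_data(r"C:\data\training_chips", batch_size=8)

# This is where it stalls: 30+ minutes on the 3070, then blurry chips.
data.show_batch()

# Never reached on the 3070; the model class is a placeholder too.
model = SingleShotDetector(data)
model.lr_find()
model.fit(10)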

I'm using ArcGIS Pro 2.6 with the machine learning MSI installed, working in Jupyter notebooks.
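As a sanity check (a minimal sketch, assuming the Pro Python environment is active), this is roughly what I run in a notebook cell to confirm what PyTorch sees. The RTX 3070 is an Ampere card with compute capability (8, 6), which needs a CUDA 11.x build of PyTorch:

import torch

# Which CUDA build is PyTorch compiled against, and is the GPU visible?
print("CUDA available:", torch.cuda.is_available())
print("PyTorch CUDA build:", torch.version.cuda)

if torch.cuda.is_available():
    print("Device:", torch.cuda.get_device_name(0))
    # An RTX 3070 reports (8, 6); CUDA 10.x builds have no sm_86 kernels.
    print("Compute capability:", torch.cuda.get_device_capability(0))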

[Attachments: Show_Batch_Slow.PNG, Code.PNG]

2 Replies
DanPatterson
MVP Esteemed Contributor

You will probably have better luck finding information on their GitHub site:

Esri/arcgis-python-api: Documentation and samples for ArcGIS API for Python (github.com)

Check issues open/closed for starters.


... sort of retired...
DuncanHornby
MVP Notable Contributor

I too am unable to run any of the deep learning tools on my new machine with a clean install of ArcGIS Pro 2.8 and a new RTX 3070 card. I installed the deep learning packages using the installer from the GitHub website. I often get a bad token error message when trying to train, pointing to the "ModelFile" setting in the .emd file. If I am somehow lucky enough to get past that, the actual Detect Objects tool fails, with ArcGIS Pro crashing out to the error report dialog. I also note that the GPU never gets used, even though I am expressly selecting it in the environment settings.

I've come to the conclusion that the underlying code libraries the ArcGIS Pro tools use are currently flaky and incompatible with an RTX 3070.
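One way to test that theory (a hedged sketch; torch.cuda.get_arch_list() only exists in newer PyTorch builds, hence the guard) is to ask the bundled PyTorch which GPU architectures it was compiled for. If sm_86 is missing from the list, the 3070 has no native kernels in that build:

import torch

if torch.cuda.is_available():
    # The RTX 3070 is Ampere, i.e. compute capability (8, 6) / sm_86.
    print("Device capability:", torch.cuda.get_device_capability(0))

# get_arch_list() was added in newer PyTorch releases, so guard the call.
if hasattr(torch.cuda, "get_arch_list"):
    print("Compiled for:", torch.cuda.get_arch_list())
else:
    print("This PyTorch build predates get_arch_list(); likely too old for Ampere.")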
