I have tried running MXNet 1.3.1 with CUDA 9.2 under Windows 10 on a Dell laptop that has both an Intel graphics adapter and an NVIDIA GTX 1060. Whenever I use mx.gpu(0) as the context, any ndarray operation within that context fails with the following error:
mxnet.base.MXNetError
Message=[23:24:34] c:\jenkins\workspace\mxnet-tag\mxnet\src\engine\threaded_engine.cc:320: Check failed: device_count_ > 0 (-1 vs. 0) GPU usage requires at least 1 GPU
StackTrace:
C:\local\Anaconda3-4.1.1-Windows-x86_64\lib\site-packages\mxnet\base.py:252 in check_call
C:\local\Anaconda3-4.1.1-Windows-x86_64\lib\site-packages\mxnet\_ctypes\ndarray.py:92 in _imperative_invoke
Also, mx.test_utils.list_gpus() returns an empty list. MXNet seems to be the only software I have installed that is unable to find my mobile NVIDIA GPU; other CUDA-based software finds the adapter without problems. Has anyone else experienced this, and if so, how can I fix it so that I can use my mobile GPU?
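In case it helps narrow things down, one thing I can check is whether the CUDA runtime library is even visible to the dynamic loader, since that is the kind of failure that would make the device count come back as -1. Here is a minimal sketch of such a check; the DLL name cudart64_92 is my assumption for the CUDA 9.2 runtime on Windows, and the helper name is made up for illustration:

```python
import ctypes.util

def find_cuda_runtime():
    """Look for a CUDA runtime library on the loader path.

    Returns the library name/path if found, otherwise None.
    A None result would suggest MXNet cannot load the CUDA
    runtime at all, rather than failing to see the GPU itself.
    """
    # cudart64_92 is the assumed Windows name for the CUDA 9.2
    # runtime DLL; plain "cudart" covers other platforms/versions.
    for name in ("cudart64_92", "cudart"):
        found = ctypes.util.find_library(name)
        if found:
            return found
    return None

print(find_cuda_runtime())
```

If this prints None, the problem would be the loader path (e.g. the CUDA bin directory missing from PATH) rather than the GPU or the driver. I would also double-check that the installed package is the CUDA build (mxnet-cu92) and not the CPU-only mxnet package, since the CPU build reports zero GPUs by design.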