MXNetError: [22:38:03] C:\ci\libmxnet_1533399150922\work\src\engine\threaded_engine.cc:318: Check failed: device_count_ > 0 (-1 vs. 0) GPU usage requires at least 1 GPU
The CUDA environment reports no errors.
What's the output of nvidia-smi?
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 376.51                 Driver Version: 376.51                    |
|-------------------------------+----------------------+----------------------+
| GPU  Name            TCC/WDDM | Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|===============================+======================+======================|
|   0  GeForce GTX 960M    WDDM | 0000:01:00.0     Off |                  N/A |
| N/A   36C    P8    N/A /  N/A |     31MiB /  4096MiB |      0%      Default |
+-------------------------------+----------------------+----------------------+

+-----------------------------------------------------------------------------+
| Processes:                                                       GPU Memory |
|  GPU       PID  Type  Process name                               Usage      |
|=============================================================================|
|    0      9252  C+G   ...crosoft.LockApp_cw5n1h2txyewy\LockApp.exe     N/A |
|    0     10036  C+G   ...eShell.Experiences.TextInput.InputApp.exe     N/A |
+-----------------------------------------------------------------------------+
Hmm. What about the output of mx.test_utils.list_gpus()?
I have solved this error. It was caused by a mismatched MXNet version. Thank you.
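One way such a mismatch shows up is when the installed MXNet wheel's CUDA suffix does not agree with the CUDA version the driver supports (shown in the nvidia-smi header). A minimal sketch of that sanity check, assuming the standard wheel naming scheme (`mxnet` for CPU-only, `mxnet-cuXY` for CUDA X.Y) and a helper name of my own:

```python
# Hypothetical helper (name is my own): extract the CUDA version an MXNet
# wheel was built against from its package name, e.g. "mxnet-cu100" -> "10.0".
# A plain "mxnet" wheel is CPU-only, so mx.gpu() will always fail with it.
def wheel_cuda_version(package_name):
    if "-cu" not in package_name:
        return None  # CPU-only build
    digits = package_name.split("-cu")[1]
    # "100" -> "10.0", "92" -> "9.2"
    return f"{digits[:-1]}.{digits[-1]}"

print(wheel_cuda_version("mxnet-cu100"))  # -> 10.0
print(wheel_cuda_version("mxnet"))        # -> None
```

If the value returned here differs from the "CUDA Version" that nvidia-smi reports (or is None), reinstalling the matching `mxnet-cuXY` wheel is the usual fix.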
I ran into the same error. Here is my nvidia-smi output:
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 410.79 Driver Version: 410.79 CUDA Version: 10.0 |
|-------------------------------+----------------------+----------------------+
| GPU Name Persistence-M| Bus-Id Disp.A | Volatile Uncorr. ECC |
| Fan Temp Perf Pwr:Usage/Cap| Memory-Usage | GPU-Util Compute M. |
|===============================+======================+======================|
| 0 GeForce RTX 208... On | 00000000:1A:00.0 Off | N/A |
| 27% 29C P8 27W / 250W | 0MiB / 10989MiB | 0% Default |
+-------------------------------+----------------------+----------------------+
| 1 GeForce RTX 208... On | 00000000:1B:00.0 Off | N/A |
| 27% 26C P8 4W / 250W | 0MiB / 10989MiB | 0% Default |
+-------------------------------+----------------------+----------------------+
| 2 GeForce RTX 208... On | 00000000:1C:00.0 Off | N/A |
| 27% 27C P8 16W / 250W | 0MiB / 10989MiB | 0% Default |
+-------------------------------+----------------------+----------------------+
| 3 GeForce RTX 208... On | 00000000:1D:00.0 Off | N/A |
| 27% 27C P8 21W / 250W | 0MiB / 10989MiB | 0% Default |
+-------------------------------+----------------------+----------------------+
| 4 GeForce RTX 208... On | 00000000:1E:00.0 Off | N/A |
| 27% 27C P8 22W / 250W | 0MiB / 10989MiB | 0% Default |
+-------------------------------+----------------------+----------------------+
| 5 GeForce RTX 208... On | 00000000:3D:00.0 Off | N/A |
| 27% 27C P8 1W / 250W | 0MiB / 10989MiB | 0% Default |
+-------------------------------+----------------------+----------------------+
| 6 GeForce RTX 208... On | 00000000:3E:00.0 Off | N/A |
| 27% 26C P8 20W / 250W | 0MiB / 10989MiB | 0% Default |
+-------------------------------+----------------------+----------------------+
| 7 GeForce RTX 208... On | 00000000:3F:00.0 Off | N/A |
| 27% 27C P8 20W / 250W | 0MiB / 10989MiB | 0% Default |
+-------------------------------+----------------------+----------------------+
| 8 GeForce RTX 208... On | 00000000:40:00.0 Off | N/A |
| 27% 27C P8 5W / 250W | 0MiB / 10989MiB | 0% Default |
+-------------------------------+----------------------+----------------------+
| 9 GeForce RTX 208... On | 00000000:41:00.0 Off | N/A |
| 27% 27C P8 2W / 250W | 0MiB / 10989MiB | 0% Default |
+-------------------------------+----------------------+----------------------+
+-----------------------------------------------------------------------------+
| Processes: GPU Memory |
| GPU PID Type Process name Usage |
|=============================================================================|
| No running processes found |
+-----------------------------------------------------------------------------+
And my mx.test_utils.list_gpus() output:
>>> import mxnet as mx
>>> mx.test_utils.list_gpus()
range(0, 0)
>>>
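An empty range from list_gpus() means MXNet cannot see any device, even though the driver can. A small defensive pattern (a sketch; the helper name is my own) is to map that result to a context label and fall back to CPU instead of crashing:

```python
# Hypothetical helper (name is my own): turn the result of
# mx.test_utils.list_gpus() into a context label, falling back to
# "cpu" when the range is empty, as in the REPL output above.
def context_name(gpu_ids):
    ids = list(gpu_ids)
    return f"gpu({ids[0]})" if ids else "cpu"

print(context_name(range(0, 0)))   # -> cpu    (no usable GPU, as above)
print(context_name(range(0, 10)))  # -> gpu(0) (ten GPUs visible)
```

With a working install, the label corresponds to constructing `mx.gpu(0)` or `mx.cpu()`; the point is that code should not assume a GPU context exists just because nvidia-smi lists devices.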
I solved this problem by updating the NVIDIA driver to version 418.67.