Installation

I think the most direct way to get mxnet running is the following:

  1. Open Google Colab: https://colab.research.google.com
  2. Install mxnet with GPU support: !pip install mxnet-cu100
  3. Check the version:
    import mxnet as mx; mx.__version__

This installs mxnet with CUDA 10 support.
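If you want a gentler check for step 3, here is a small sketch you can paste into the next cell (the helper name is mine, not part of mxnet; it just reports the version, or None if the install cell has not run yet):

```python
# Sanity check for the Colab install above: report mxnet's version,
# or None when mxnet is not importable (e.g. before the pip cell ran).
import importlib.util

def mxnet_version():
    """Return mxnet's version string, or None when mxnet is not installed."""
    if importlib.util.find_spec("mxnet") is None:
        return None
    import mxnet as mx
    return mx.__version__

print(mxnet_version())
```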


Hello!

I have followed everything without problems up to the point where I have to update the environment description in environment.yml for mxnet with CUDA support. The current entry is mxnet==1.4.0, which I changed to mxnet-cu101==1.4.0, as my CUDA version is 10.1.
However, when I run:
conda env update -f environment.yml
I get the following error:
Could not find a version that satisfies the requirement mxnet-cu101 (from -r /home/nenko/miniconda3/d2l-en/condaenv.alwxqubp.requirements.txt (line 1)) (from versions: )
No matching distribution found for mxnet-cu101 (from -r /home/nenko/miniconda3/d2l-en/condaenv.alwxqubp.requirements.txt (line 1))

Any help would be much appreciated!

mxnet is currently unavailable for CUDA 10.1.
Try replacing CUDA 10.1 with 10.0 (i.e. mxnet-cu100).
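To make the swap concrete, the pip section of environment.yml would end up looking something like this (a sketch; the surrounding entries are assumptions, only the mxnet line matters):

```yaml
name: gluon            # placeholder; keep whatever name your file already uses
dependencies:
  - python=3.7
  - pip:
    - mxnet-cu100==1.4.0   # was: mxnet==1.4.0; no mxnet-cu101 wheel is published
```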

Hi! Any help here would be much appreciated. I have CUDA 10.0 installed and running, so when I updated my environment.yml I changed mxnet to mxnet-cu100. However, when I run the following in a Jupyter notebook:

import mxnet as mx

I receive the error:

libcudart.so.10.0: cannot open shared object file: No such file or directory

How can I solve this issue?

@gregor114, you need to make sure cuda is available in your dynamic library path.

export LD_LIBRARY_PATH=/usr/local/cuda:/usr/local/cuda/lib64:/usr/local/lib:/usr/lib:/usr/local/cuda/extras/CUPTI/lib64:$LD_LIBRARY_PATH
export PATH=/usr/local/cuda/bin:$PATH

Also make sure that /usr/local/cuda is a symlink to /usr/local/cuda-10.0.
Running ls -la /usr/local/cuda should show:

lrwxrwxrwx 1 root root 19 Feb 25 17:34 /usr/local/cuda -> /usr/local/cuda-10.0
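If the ln flags are unfamiliar, here is the same fix rehearsed in a scratch directory first (the real target is /usr/local/cuda-10.0 and usually needs sudo; the /tmp paths below are just for the rehearsal):

```shell
# Rehearse the symlink fix in /tmp before touching /usr/local:
mkdir -p /tmp/cuda-demo/cuda-10.0
# -s: symbolic link, -f: replace an existing link, -n: treat an existing
# symlink-to-directory as a file instead of descending into it
ln -sfn /tmp/cuda-demo/cuda-10.0 /tmp/cuda-demo/cuda
readlink /tmp/cuda-demo/cuda    # prints /tmp/cuda-demo/cuda-10.0
```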

I think the best way to set up the environment would be a Docker container with preinstalled packages. Also, when you discuss Linux you refer to apt, but Debian/Ubuntu is not the only flavor of Linux.

Hi, I cloned the repo and noticed all the chapter notebooks are saved as .md files. Might be a silly question with a simple fix, but how do I run these? I installed this a few weeks ago and they were .ipynb as normal. Do I have to change them somehow? How can I do this with all of them? Why are they saved as .md in the first place?

Note: If I simply change the extension, the notebooks do not load.

I have solved the issue. The GitHub repo holds the chapter markdowns; the actual code for the book should be obtained by following the install instructions. I had simply assumed one could fork the repo and run the code straight from GitHub.

I don’t have an NVIDIA GPU; I have Intel graphics, so installing CUDA shows an error. How do I proceed with the installation? I am also unable to install mxnet; it shows: “Could not install packages due to an EnvironmentError: [WinError 5] Access is denied: ‘c:\programdata\anaconda3\lib\site-packages\idna-2.8-py3.7.egg-info\dependency_links.txt’
Consider using the --user option or check the permissions.”

How did you do that? Could you please elaborate?

This appears broken for me. I’m on Ubuntu 18.04 with Python 3.6.5 using 4.5.4. I’m trying to follow the instructions, but when I test the installation by running ‘import d2l’ in the notebook I get:

ImportError: cannot import name ‘linreg’

update: it looks like the following commit broke the d2l import: https://github.com/d2l-ai/d2l-en/commit/241260f6a015e9d16c25984159c77bab64e22c4d

Checking out the previous commit worked for me.
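For anyone unfamiliar with the notation: appending `^` to a commit means “its parent”, so you can pin the clone to the commit just before the breaking one. A throwaway demo of the idea (in the real d2l-en clone you would use the hash from the linked commit instead of HEAD):

```shell
# Demonstrate `commit^` (parent) in a scratch repo:
git init -q /tmp/parent-demo
cd /tmp/parent-demo
git -c user.email=demo@example.com -c user.name=demo commit -q --allow-empty -m "good"
git -c user.email=demo@example.com -c user.name=demo commit -q --allow-empty -m "breaking"
git checkout -q HEAD^            # detached HEAD at the commit before "breaking"
git log -1 --format=%s           # prints: good
```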


It would be nice if the downloaded d2l code were labeled with the corresponding chapters.
For example, 2.1.1: getting started is located in

d2l-en/chapter_crashcourse/ndarray.ipynb

If it was in

d2l-en/2.crashcourse/1.ndarray.ipynb

it would be a lot easier to find.
Or at least have the notebooks share the chapter name: 2.1 Data manipulation -> data_manipulation.ipynb instead of ndarray.ipynb.

Very neat project!
Soon all hard-science textbooks will be in this format.
Just one question for now:
is there an index available for the downloaded version?
In “mxnet-the-straight-dope-master” there was a README.md file
that could be converted to ipynb and worked as an index.

For MacOSX, the command for “Installing Running Environment” should be

$ pip install https://apache-mxnet.s3-accelerate.amazonaws.com/dist/python/numpy/latest/mxnet_mkl-1.5.0-cp37-cp37m-macosx_10_11_x86_64.whl

When trying to install the d2l dependency on Google Colab, I get a prompt_toolkit error. It happens because the Colab Jupyter kernel uses an older version of prompt_toolkit while d2l requires a newer one. How do I solve this?

Hi, I’m new to this book and am excited to get into learning. I have followed the installation steps and can launch Jupyter Notebook properly. However, in the first part of lesson two, when I try to run the code “from mxnet import mx” I get an OSError stating that I have a missing module, even though I have already pip-installed mxnet. Can someone please let me know what is going on? It is extremely frustrating.

Could anyone help with this mismatch error? I got it to work previously with @ThomasDelteil’s advice on April 10th, but following the advice again now has not resolved the issue.

The exact error I get, upon trying to import mxnet as mx in the notebook, is the following:

I followed the same advice, which is to symlink cuda and make sure cuda is in the dynamic library path:

After doing this I still get the same issue. mxnet-cu100 is installed via pip.

Quick note: I see the MiniConda/Conda Envs part of the installation is no longer there? Is pip install preferred now?

Thanks for any help, I really appreciate it.

Hi, I have Spyder (Python 3.7.4) open and am running the first code lines. Typing “jupyter notebook” always gives me this error: SyntaxError: invalid syntax. Does anyone know why?