June 2, 2018 · EN snippets

All-in-one Deep Learning Notebook

Tired of setting up your development environment again and again because of corrupted installations, new infrastructure, or a move to a new virtual machine? I know how much it hurts to start all over again.

Using Docker for your dev/prod/test environments is strongly recommended to make projects portable. Once every tiny setting is in place, you're good to go with a ready-made package.

Here I'll share my personal DL development notebook. It's an all-in-one service including Keras (with TensorFlow and Theano backends) and PyTorch as abstraction frameworks, scikit-learn, and all the other dependencies that the average data scientist or deep learning enthusiast would require. It's been structured with love and experience over many DL and data-analysis tasks.

Fetch it via Docker:

docker pull fibersio/nb

It's a single-user JupyterLab environment (with additional plugins) based on Ubuntu 16.04, with Python 3.5, CUDA 9.0, TensorFlow 1.8, and Node 8 installed.

After pulling the image, it's easy to serve it on your bare-metal or virtual machine. Type the following, where $PORT is the host port to expose and $PASS is the token you'll use to log in:

docker run -i --rm -d -p $PORT:8888 -v $PWD:/workspace/data fibersio/nb --NotebookApp.token=$PASS


Note that this will run on CPU without CUDA support.
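Once the container is up, you can sanity-check the environment from a notebook cell. The helper below is just an illustrative sketch (it isn't shipped with the image); it only probes whether each package is importable:

```python
import importlib.util

def check_stack(names):
    """Return a dict mapping each package name to whether it is importable."""
    return {name: importlib.util.find_spec(name) is not None for name in names}

# Inside the notebook, all of these should come back True:
print(check_stack(["tensorflow", "keras", "torch", "sklearn"]))
```

Since find_spec only locates a package without importing it, the check stays cheap even for heavy frameworks like TensorFlow.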

If you have a CUDA 9.0-compatible GPU and the necessary drivers installed, you'll want the GPU-enabled notebook instead. Just type:

nvidia-docker run -i --rm -d -p $PORT:8888 -v $PWD:/workspace/data fibersio/nb --NotebookApp.token=$PASS

This will allocate all of your GPUs to the notebook. If you have multiple GPUs and want to limit how many are used, you can easily select specific devices by prefixing the run command with the NV_GPU variable, as follows:

NV_GPU=0,1,2 nvidia-docker run -i --rm -d -p $PORT:8888 -v $PWD:/workspace/data fibersio/nb --NotebookApp.token=$PASS
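NV_GPU takes a comma-separated list of host GPU indices, and inside the container CUDA renumbers the visible devices from 0 (much like CUDA_VISIBLE_DEVICES does). A minimal sketch of that mapping — gpu_mapping is a hypothetical helper for illustration, not part of nvidia-docker:

```python
def gpu_mapping(nv_gpu):
    """Map container-visible CUDA device index -> host GPU index,
    for a comma-separated NV_GPU mask such as "0,2"."""
    host_ids = [int(part) for part in nv_gpu.split(",")]
    return dict(enumerate(host_ids))

# With NV_GPU=0,2 the container sees two devices:
print(gpu_mapping("0,2"))  # {0: 0, 1: 2} -> cuda:1 in the container is host GPU 2
```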

If you need further dependencies, install them with the standard Ubuntu and pip commands (such as apt install package_name and pip install package_name) inside the lab terminal, since you'll have root privileges.
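Keep in mind that the run commands above use --rm, so anything you install in the lab terminal is discarded when the container stops. For additions you want to keep, a small derived image is the usual route (a sketch; htop and seaborn are just example packages):

```dockerfile
FROM fibersio/nb
# Example extra dependencies; replace with whatever you need.
RUN apt-get update && apt-get install -y htop \
 && pip install seaborn
```

Build it with docker build -t mynb . and use mynb in place of fibersio/nb in the commands above.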

Have fun, and note that this is an ongoing, experimental notebook that might have some bugs; it will definitely improve as more people use it.