Jupyter (formerly IPython Notebook) is an open-source project that lets you easily combine Markdown text and executable Python source code on one canvas called a notebook. Visual Studio Code supports working with Jupyter Notebooks natively, as well as through Python code files.
Docker uses containers to create virtual environments that isolate a TensorFlow installation from the rest of the system. TensorFlow programs are run within this virtual environment that can share resources with its host machine (access directories, use the GPU, connect to the Internet, etc.). The TensorFlow Docker images are tested for each release.
If Jupyter itself gives you trouble, first make sure that you have installed and/or upgraded Jupyter Notebook (also inside your virtual environment): `pip install --upgrade jupyter`. If you still hit permission errors, you can change the access permissions for your user (use with caution!): `sudo chmod -R 777 ~/.local`, where 777 is a three-digit representation of the access permission.
Docker is the easiest way to enable TensorFlow GPU support on Linux since only the NVIDIA® GPU driver is required on the host machine (the NVIDIA® CUDA® Toolkit does not need to be installed).
TensorFlow Docker requirements
- Install Docker on your local host machine.
- For GPU support on Linux, install NVIDIA Docker support.
- Take note of your Docker version with `docker -v`. Versions earlier than 19.03 require nvidia-docker2 and the `--runtime=nvidia` flag. On versions including and after 19.03, you will use the `nvidia-container-toolkit` package and the `--gpus all` flag. Both options are documented on the page linked above.
- To run the `docker` command without `sudo`, create the `docker` group and add your user. For details, see the post-installation steps for Linux.
Download a TensorFlow Docker image
The official TensorFlow Docker images are located in the tensorflow/tensorflow Docker Hub repository. Image releases are tagged using the following format:
Tag | Description |
---|---|
latest | The latest release of TensorFlow CPU binary image. Default. |
nightly | Nightly builds of the TensorFlow image. (Unstable.) |
version | Specify the version of the TensorFlow binary image, for example: 2.1.0 |
devel | Nightly builds of a TensorFlow master development environment. Includes TensorFlow source code. |
custom-op | Special experimental image for developing TF custom ops. More info here. |
Each base tag has variants that add or change functionality:
Tag Variants | Description |
---|---|
tag-gpu | The specified tag release with GPU support. (See below) |
tag-jupyter | The specified tag release with Jupyter (includes TensorFlow tutorial notebooks) |
You can use multiple variants at once. For example, the following downloads TensorFlow release images to your machine:
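A sketch of such pull commands, with illustrative tags taken from the tables above:

```bash
docker pull tensorflow/tensorflow                     # latest stable CPU-only release
docker pull tensorflow/tensorflow:devel-gpu           # nightly dev build with GPU support
docker pull tensorflow/tensorflow:latest-gpu-jupyter  # latest release with GPU support and Jupyter
```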
Start a TensorFlow Docker container
To start a TensorFlow-configured container, use the following command form:
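In rough form, with everything in brackets optional or a placeholder, followed by one concrete example:

```bash
# General form; bracketed pieces are optional or placeholders:
#   docker run [-it] [--rm] [-p hostPort:containerPort] tensorflow/tensorflow[:tag] [command]
# For instance, to run the Jupyter variant and publish its port to the host:
docker run -it --rm -p 8888:8888 tensorflow/tensorflow:latest-jupyter
```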
For details, see the docker run reference.
Examples using CPU-only images
Let's verify the TensorFlow installation using the `latest` tagged image. Docker downloads a new TensorFlow image the first time it is run:
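One way to run that check, using a small throwaway computation as the example workload:

```bash
docker run -it --rm tensorflow/tensorflow \
    python -c "import tensorflow as tf; print(tf.reduce_sum(tf.random.normal([1000, 1000])))"
```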
Let's demonstrate some more TensorFlow Docker recipes. Start a `bash` shell session within a TensorFlow-configured container:
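A minimal invocation, using the default `latest` tag:

```bash
docker run -it tensorflow/tensorflow bash
```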
Within the container, you can start a `python` session and import TensorFlow.
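From that shell, a quick check might look like this:

```bash
# Run inside the container's bash session:
python -c "import tensorflow as tf; print(tf.__version__)"
```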
To run a TensorFlow program developed on the host machine within a container, mount the host directory and change the container's working directory (`-v hostDir:containerDir -w workDir`):
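For instance, assuming a `script.py` in the current host directory (the script name is hypothetical):

```bash
docker run -it --rm -v "$PWD":/tmp -w /tmp tensorflow/tensorflow python ./script.py
```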
Permission issues can arise when files created within a container are exposed to the host. It's usually best to edit files on the host system.
Start a Jupyter Notebook server using TensorFlow's nightly build:
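A typical invocation, using the nightly Jupyter variant from the tag tables above and publishing the notebook port:

```bash
docker run -it -p 8888:8888 tensorflow/tensorflow:nightly-jupyter
```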
Follow the instructions and open the URL in your host web browser: http://127.0.0.1:8888/?token=...
GPU support
Docker is the easiest way to run TensorFlow on a GPU since the host machine only requires the NVIDIA® driver (the NVIDIA® CUDA® Toolkit is not required).
Install the Nvidia Container Toolkit to add NVIDIA® GPU support to Docker. `nvidia-container-runtime` is only available for Linux. See the nvidia-container-runtime platform support FAQ for details.
Check if a GPU is available:
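For example, by listing PCI devices on the host:

```bash
lspci | grep -i nvidia
```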
Verify your `nvidia-docker` installation:
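One way to check is to run `nvidia-smi` inside a GPU-enabled container (the image used here is just an example):

```bash
docker run --gpus all --rm tensorflow/tensorflow:latest-gpu nvidia-smi
```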
`nvidia-docker` v2 uses `--runtime=nvidia` instead of `--gpus all`. `nvidia-docker` v1 uses the `nvidia-docker` alias, rather than the `--runtime=nvidia` or `--gpus all` command line flags.
Examples using GPU-enabled images
Download and run a GPU-enabled TensorFlow image (may take a few minutes):
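For example, again using a small throwaway computation as the test workload:

```bash
docker run --gpus all -it --rm tensorflow/tensorflow:latest-gpu \
    python -c "import tensorflow as tf; print(tf.reduce_sum(tf.random.normal([1000, 1000])))"
```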
It can take a while to set up the GPU-enabled image. If repeatedly running GPU-based scripts, you can use `docker exec` to reuse a container.
Use the latest TensorFlow GPU image to start a `bash` shell session in the container:
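A minimal form of that command:

```bash
docker run --gpus all -it tensorflow/tensorflow:latest-gpu bash
```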
Jupyter Notebook is a powerful tool, but how can you use it in all its glory on a server? In this tutorial you will see how to set up Jupyter Notebook on a server from Digital Ocean, AWS, or most other hosting providers. Additionally, you will see how to use Jupyter notebooks over SSH tunneling or over SSL with Let's Encrypt.
Jupyter is an open-source web application that enables interactive computing from the browser. You can create documents that feature live code, Markdown documentation, equations, visualizations, and even widgets and other interesting capabilities. The name Jupyter comes from the three core languages it supports: Julia, Python, and R. Jupyter connects to a kernel for a specific language, the most common being the IPython kernel. It supports a whole variety of kernels, and you should find most languages you need. This tutorial was written in JupyterLab, the next evolution of the Jupyter notebook.
In this tutorial we will be working with Ubuntu 16.04/18.04 servers, but most steps should be fairly similar for Debian 8/9 distributions. We will first go through creating SSH keys, adding a new user on the server, and installing Python and Jupyter with Anaconda. Next, you will set up Jupyter to run on the server. Finally, you can choose to run Jupyter notebooks either over SSH tunneling or over SSL with Let's Encrypt.
We are starting with a fresh server, and to add more security when accessing it, you should consider using SSH key pairs. A key pair consists of a public key, which is uploaded to the server, and a private key, which stays on your machine. Some hosting providers require you to upload the public key before creating the server instance. To create a new SSH key pair you can use the ssh-keygen tool by simply typing the command:
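In its simplest form, with no extra options:

```bash
# Generates a key pair; you will be asked for a file path and an optional passphrase.
ssh-keygen
```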
This will prompt you for a file path and a passphrase if you want one. There are other options you can choose from, such as the public key algorithm or the file name. You can find a very good tutorial here on how to create a new SSH key with ssh-keygen for Linux or macOS. If you are using Windows, you can create SSH keys with PuTTYgen as described here. If your hosting provider does not need a public key before creation, you can copy the public key with the ssh-copy-id tool:
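Roughly like this, where `user` and `host` stand in for your server login and address:

```bash
ssh-copy-id -i ~/.ssh/id_rsa.pub user@host
```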
Finally, you can connect to your server with:
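For example (logging in as root is an assumption; use whatever login your provider gives you):

```bash
ssh -i ~/.ssh/id_rsa root@host
```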
where `~/.ssh/id_rsa` is the path to your SSH private key and `host` is the host address or IP address of your server instance.
On some servers you start off as the root user. It is considered bad practice to work directly as root, since it has a lot of privileges that can be destructive if some commands are run by accident. If you already have a user, you can skip this section. Note that you can replace `cloud-user` in all the following commands with the user name you want. Start by creating a new user:
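For example, as root (prefix with sudo if you are not logged in as root):

```bash
adduser cloud-user
```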
This command will ask you a couple of questions, including a password. Next, you'll want to grant administrative privileges to this user. You can do this by typing:
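One common way is to add the user to the sudo group:

```bash
usermod -aG sudo cloud-user
```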
Now you are ready to switch to the new user with `su cloud-user` or by connecting to your server with `ssh cloud-user@host`. Optionally, you can add the SSH keys of the root user to the new user for additional security; otherwise you can skip to the next section on how to install Anaconda. If you have existing SSH keys for the root user, you can copy the public key from the root home folder to the user's home folder as shown here:
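A sketch of that copy step, run as root (paths assume the `cloud-user` account from above):

```bash
mkdir -p /home/cloud-user/.ssh
cp /root/.ssh/authorized_keys /home/cloud-user/.ssh/authorized_keys
```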
Next, you need to change the permissions for both the folder and the public key:
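Something along these lines should do it (the chown is needed here because the files were copied as root):

```bash
chown -R cloud-user:cloud-user /home/cloud-user/.ssh
chmod 700 /home/cloud-user/.ssh
chmod 600 /home/cloud-user/.ssh/authorized_keys
```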
If you are using a password for your user, you need to update `/etc/ssh/sshd_config`:
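For example, by opening it in an editor:

```bash
sudo nano /etc/ssh/sshd_config
```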
There you want to find the line `PasswordAuthentication no` and change the `no` to a `yes` to allow password authentication. Finally, restart the SSH service by typing `service ssh restart`. For other distributions, have a look at this guide, where you will also see how to set up a firewall.
Anaconda is an open-source distribution of Python (and R) for scientific computing, including package management and deployment. With it, you have most of the tooling that you need, including Jupyter. To install Anaconda, go to the downloads for Linux and copy the Linux installer link for the latest Python 3.x version. Then you can download the installer with `wget`:
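For example (the release in this URL is only an example; copy the current link from the downloads page):

```bash
wget https://repo.anaconda.com/archive/Anaconda3-2020.02-Linux-x86_64.sh
```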
Next, you can install Anaconda by using `bash` as follows:
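Using the example file name from the wget step above:

```bash
bash Anaconda3-2020.02-Linux-x86_64.sh
```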
During installation, it is important to type `yes` when the following prompt appears:
After you have finished installing, you want to initialize the `conda` command-line tool and package manager from Anaconda with:
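The exact commands depend on your setup; a plausible pair is to put Anaconda on your PATH for the current session and then run conda init:

```bash
export PATH="$HOME/anaconda3/bin:$PATH"   # assumes the default install location
conda init                                # hooks conda into your shell startup file
```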
These two commands set up Anaconda on your server. If you have run the Anaconda bash file with sudo, you will get a `Permission denied` error. You can solve it, as shown in this question, by typing `sudo chown -R $USER:$USER /home/user/anaconda3`. This changes the owner of the folder to the current user with the chown command.
Jupyter is installed with Anaconda, but we need to do some configuration in order to run it on the server. First, you'll want to create a password for Jupyter Notebook. You can do this by starting the IPython shell with `ipython` and generating a password hash:
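Inside IPython you would import `passwd` from `notebook.auth` (part of the classic Jupyter Notebook package that Anaconda ships) and call it; the same thing as a shell one-liner looks like this and prompts for the password before printing the hash:

```bash
python -c "from notebook.auth import passwd; print(passwd())"
```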
Save the resulting hash for now; we will need it in a moment. Next, you want to generate a configuration file, which you can create by typing:
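Jupyter can generate a default configuration file for you:

```bash
jupyter notebook --generate-config
```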
Now open the configuration file with `sudo nano ~/.jupyter/jupyter_notebook_config.py` and copy the following code into the file, replacing the hash in this snippet with the one you previously generated:
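A minimal sketch of the settings involved, shown here as a shell snippet that appends them to the file (you can just as well paste the same lines in with nano; the sha1 value is a placeholder for your own hash):

```bash
cat >> ~/.jupyter/jupyter_notebook_config.py << 'EOF'
c.NotebookApp.ip = '*'               # listen on all interfaces
c.NotebookApp.open_browser = False   # no browser on a headless server
c.NotebookApp.password = 'sha1:...'  # paste your generated hash here
c.NotebookApp.port = 8888
EOF
```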
Now you should be set up. Next, you can decide whether you want to use SSH tunneling or you want to use SSL encryption and access your jupyter notebook over your own domain name.
You can tunnel to your server by adding the `-L` argument to the `ssh` command, which is responsible for port forwarding. The first `8888` is the port you will access on your local machine (if you already use this port for another Jupyter instance, you can use port 8889 or a different open port). You can then access it in your browser at `localhost:8888`. The second part, `localhost:8888`, specifies the destination address as resolved on the server. Since we run the notebook locally on the server, this is again localhost. This means that requests to `localhost:8888` on our machine are forwarded to `localhost:8888` on the server. Here is how the command would look:
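With a user named `cloud-user` (as created earlier) and `host` standing in for your server address:

```bash
ssh -L 8888:localhost:8888 cloud-user@host
```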
If you already have another Jupyter notebook running on your local machine, you can change the local port to e.g. `8889`, which would result in the command:
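That is, only the local side of the forwarding changes:

```bash
ssh -L 8889:localhost:8888 cloud-user@host
```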
Now, you can create a notebook folder for your projects on the server and run Jupyter notebook inside:
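For example (the folder name is up to you):

```bash
mkdir notebooks
cd notebooks
jupyter notebook
```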
You can also use JupyterLab instead, which is a more powerful interface, and it also comes pre-installed with Anaconda. You can start it by typing `jupyter-lab` instead of `jupyter-notebook`.
It is also possible to use SSL encryption for your Jupyter notebook. This enables you to access your Jupyter notebooks through the internet, which makes it handy to share results with your colleagues. To do this you can use Let's Encrypt, a free Certificate Authority (CA) that provides an easy way to obtain TLS/SSL certificates. This can be fully automated with their certbot tool. To find the installation guide for your system, have a look at this list. For Ubuntu 18.04 the installation looks as follows:
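At the time this meant adding the certbot PPA (newer guides recommend the snap package instead); a sketch of those steps:

```bash
sudo apt-get update
sudo apt-get install software-properties-common
sudo add-apt-repository universe
sudo add-apt-repository ppa:certbot/certbot
sudo apt-get update
sudo apt-get install certbot
```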
Now, you can run certbot for the domain that you have:
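For example, using the standalone plugin (replace example.com with your own domain; port 80 must be reachable during the challenge):

```bash
sudo certbot certonly --standalone -d example.com
```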
After going through the prompts, you should get to this output:
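The exact wording varies between certbot versions, but the certificate and key typically land under /etc/letsencrypt/live/your-domain/; for example:

```bash
# Typical locations (example.com stands in for your domain):
#   /etc/letsencrypt/live/example.com/fullchain.pem   (certificate)
#   /etc/letsencrypt/live/example.com/privkey.pem     (private key)
sudo ls /etc/letsencrypt/live/example.com/
```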
Great! You have your certificate and key file ready. Now you can use them in your Jupyter notebook configuration file. Before you can do that, you need to change the owner of the certificate and key file (replace `user` with your own user name):
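A plausible form of those commands, assuming the example.com paths from above (the entries in live/ are symlinks, so chown follows them to the real files):

```bash
sudo chown user /etc/letsencrypt/live/example.com/fullchain.pem
sudo chown user /etc/letsencrypt/live/example.com/privkey.pem
# If Jupyter still cannot read them, the live/ and archive/ directories may also need to be made accessible.
```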
Next, you can add the following code to the `~/.jupyter/jupyter_notebook_config.py` configuration file:
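A sketch of those two settings, again shown as an append from the shell (adjust the domain to your own):

```bash
cat >> ~/.jupyter/jupyter_notebook_config.py << 'EOF'
c.NotebookApp.certfile = '/etc/letsencrypt/live/example.com/fullchain.pem'
c.NotebookApp.keyfile = '/etc/letsencrypt/live/example.com/privkey.pem'
EOF
```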
Finally, you can access Jupyter notebooks securely over `https://example.com:8888`. Just make sure to use `https://` instead of `http://`. If you made any mistakes, you can delete the certbot certificate with `sudo certbot delete` or `sudo certbot delete --cert-name example.com`. If you are using a firewall, make sure that port `8888` is open. Here is a good guide on using the Uncomplicated Firewall (UFW).
You have learned how to set up Jupyter for a server from start to finish. This is a task that gets easier with every server setup you do. Make sure to delve into the surrounding topics of Linux server administration, since working with servers can be intimidating in the beginning. With Jupyter you have access to a wide variety of kernels that enable you to use other languages; a list of all available kernels can be found here. I hope this was helpful, and if you have any further questions or remarks, feel free to share them in the comments below.
I covered in a previous tutorial how to work with virtual environments in Jupyter notebook. There is also an option to run Jupyter as a Docker container; you can use, for example, the jupyter/datascience-notebook container. You can read more on how to work with Jupyter and Docker in this guide. For further security considerations, have a look at Security in the Jupyter notebook server. Here are further links that I have learned from and that might be useful for you too:
Image from Wikimedia Commons.