
Using GPU Docker for TensorFlow on the DEVPHI server


!!! DEPRECATED !!!

What is Docker?

Docker is a tool designed to make it easier to create, deploy, and run applications by using containers. Containers allow a developer to package up an application with all of the parts it needs, such as libraries and other dependencies, and ship it all out as one package.

We installed Docker on the DEVPHI server with the NVIDIA runtime configuration recommended for TensorFlow with GPU support.

Below are the instructions for using Docker on the DEVPHI server.

Log in to the DEVPHI server

$ ssh -X your_username@devphi.ncc.unesp.br

Accessing your files inside the Docker container

To access your files inside the Docker container, they just need to be placed in the /data directory of any server in the manycore cluster.

Path of /data inside the container:

/notebooks/data/
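
For example, a file copied into /data on the host becomes visible inside the container under /notebooks/data/. The file name below is only an illustration:

$ cp my_model.py /data/
$ ls /notebooks/data/my_model.py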

Run a Jupyter notebook inside the Docker container:

$ docker run --runtime=nvidia -it -p 8888:8888 -v /data:/notebooks/data/ tensorflow/tensorflow:latest-gpu

Jupyter will print the session token in its startup output.

After that, you can access the notebook in your browser:

http://devphi.ncc.unesp.br:8888/?token=your_session_token_here
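
If you prefer to keep the notebook running after closing your terminal, the same command can be started in detached mode with a name; the -d and --name flags are standard Docker options, and the name tf-notebook is only an example. The token can then be recovered from the container logs:

$ docker run --runtime=nvidia -d --name tf-notebook -p 8888:8888 -v /data:/notebooks/data/ tensorflow/tensorflow:latest-gpu
$ docker logs tf-notebook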

Run bash inside the Docker container:

$ docker run --runtime=nvidia -it -v /data:/notebooks/data/ tensorflow/tensorflow:latest-gpu bash
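
Once inside the bash session, you can sanity-check that the GPUs are visible to the container and to TensorFlow. This is a minimal check, assuming the stock tensorflow/tensorflow:latest-gpu image (TensorFlow 1.x at the time this page was written); run these commands inside the container:

$ nvidia-smi
$ python -c "import tensorflow as tf; print(tf.test.is_gpu_available())"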

To exit the container:

$ exit
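
Typing exit ends the container's main process, so the container stops. If a container is still running (for example, a notebook started in detached mode), it can be listed and stopped from the host; the container ID comes from docker ps:

$ docker ps
$ docker stop <container_id>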