Thursday, January 24, 2019

TensorFlow Docker setup on Ubuntu

Introduction


To continue my series of experiments on setting up a TensorFlow development environment, in this post I will cover:

How to set up a TensorFlow Docker development environment on an Ubuntu machine.
This is one step beyond what I did to set up TensorFlow on a Windows machine, this time alongside the robust Visual Studio Code as the IDE for development.
The following are the steps I used for the quick setup:

STEP 1: ACQUIRE THE UBUNTU MACHINE

I quickly acquired the latest Ubuntu Linux machine through my AWS account. You can use this quick tutorial from the AWS documentation to “Launch Instance” of type Ubuntu.
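For reference, launching the instance can also be scripted with the AWS CLI. The sketch below is only illustrative; the AMI ID, key pair, and security group are placeholders you would replace with your own values (the console “Launch Instance” wizard from the tutorial works just as well):

$ aws ec2 run-instances \
      --image-id ami-0xxxxxxxxxxxxxxxx \
      --instance-type t2.medium \
      --count 1 \
      --key-name my-key-pair \
      --security-group-ids sg-0xxxxxxxxxxxxxxxx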



STEP 2: SETTING UP DOCKER

Once we have the machine, the next step is to quickly set up Docker. This requires executing the commands described in the tutorial: Get Docker CE for Ubuntu.
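For reference, the tutorial boils down to roughly the following commands (check the linked page for the current, authoritative steps):

$ sudo apt-get update
$ sudo apt-get install -y apt-transport-https ca-certificates curl software-properties-common
$ curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
$ sudo add-apt-repository \
      "deb [arch=amd64] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable"
$ sudo apt-get update
$ sudo apt-get install -y docker-ce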

STEP 3: CREATE A DOCKER IMAGE

In the current working directory ($PWD), create a file named Dockerfile (without any extension), copy the content below into it, and save the file.

FROM python:3.6

# OS packages needed for pulling and unpacking code
RUN apt-get update -y
RUN apt-get install -y git
RUN apt-get install -y unzip

# working directory inside the container, exposed as a mountable volume
WORKDIR /remote
VOLUME /remote
ENV REPO ""

# upgrade pip and install packaging tools
RUN pip install --upgrade pip
RUN pip install wheel
RUN pip install -U pip virtualenv

# create a virtualenv and install the Python data/ML stack
# (activate is invoked via sh, so the packages below end up on the image's
#  default Python, which is what the container uses at runtime)
RUN virtualenv --system-site-packages -p python ./tensorflow && \
    sh ./tensorflow/bin/activate && \
    pip install --upgrade pip && \
    pip install --upgrade numpy && \
    pip install --upgrade scipy && \
    pip install --upgrade opencv-python && \
    pip install --upgrade matplotlib && \
    pip install --upgrade pandas && \
    pip install --upgrade sklearn && \
    pip install --upgrade scikit-image && \
    pip install --upgrade tensorflow && \
    pip install --upgrade keras && \
    pip list

Refer to the code in the repo link.

STEP 4: BUILD THE DOCKER IMAGE

Once we have the Dockerfile, build the image from the same directory:
$ sudo docker build -t di-ubuntu-py3-tf .
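To confirm the build succeeded, list the local images; the newly tagged di-ubuntu-py3-tf image should appear in the output:

$ sudo docker images | grep di-ubuntu-py3-tf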

STEP 5: RUN THE DOCKER IMAGE INSIDE A CONTAINER

So far we have built the TensorFlow Docker image on the Ubuntu machine; the next step is to allocate compute resources to it. Let's run this image inside a container using the command below:

$ sudo docker run -i -t --rm -v $(pwd):/remote:rw di-ubuntu-py3-tf /bin/bash

The above command runs the TensorFlow Docker image inside a container; the options used in the command have the following meanings:


  • -t: allocates a pseudo-TTY (terminal) inside the new container
  • -i: keeps the container's standard input (STDIN) open so you can interact with it
  • --rm: automatically removes the container when the process exits
  • -d: runs the container detached, as a daemon (not used above; see the sketch after this list)
  • -v: mounts the HOST DIRECTORY on the Ubuntu machine onto the CONTAINER DIRECTORY defined in the Docker image
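
Since -d is not used in the interactive command above, here is a minimal sketch of a detached variant, assuming the image name from Step 4 (tf-dev is just an illustrative container name); the container is kept alive with a no-op process and you attach to it with docker exec when needed:

$ sudo docker run -d --name tf-dev -v $(pwd):/remote:rw di-ubuntu-py3-tf tail -f /dev/null
$ sudo docker exec -it tf-dev /bin/bash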


STEP 6: INSIDE THE CONTAINER

Once the container is started, it provides a fully capable TensorFlow development environment on the Ubuntu (Linux) machine. The /remote directory, which is also defined as the working directory for the container in the Dockerfile, is mapped to the current working directory ($(pwd)) on the host machine. This mapped volume persists on the host machine even after the container is terminated.
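A quick sanity check inside the container (a minimal sketch; hello.txt is just an illustrative file name) confirms that TensorFlow imports cleanly and that files written under /remote land in the mounted host directory:

$ python -c "import tensorflow as tf; print(tf.__version__)"
$ touch /remote/hello.txt    # shows up in $(pwd) on the host and survives the container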


Conclusion

Quickly acquiring an Ubuntu machine from the AWS console and setting up Docker on top of it makes it easy to run various open-source deep learning Docker images on the fly. Here, I demonstrated the process using my own custom-configured Docker image, which gives me a lot of flexibility and control over my development environment.
