Docker Containers – Part 1: Introduction and Deployment

October 14, 2016

Haven’t heard of Docker yet?

Docker has become a buzzword for virtualization over the past year. You might have come across Docker in many places while following trends in cloud computing, virtualization, containerizing applications, and other hot topics.

If you haven’t heard of it yet, no worries. Let’s take a brief look at what a Docker container is and how you can get it on the well-known cloud infrastructures.

Docker is an open platform for developing, running, and packaging applications. It is based on Linux containers and takes virtualization to the next level compared with virtual machines. Many organizations are adopting cloud-based application deployment and leaving traditional application development and deployment platforms behind. Docker containers provide an easier way to build and deploy your business-critical applications within completely isolated containers. These Linux-based Docker containers rely on containment at the kernel level rather than simulating and running a separate kernel for every virtual machine, and they completely isolate your application’s data and computation on shared resources.

Let’s shed some light on how Docker works

Docker follows a client-server model. The Docker daemon does not allow users to communicate with it directly; users interact with the daemon through the Docker client. The Docker daemon lets you deploy your applications within separate containers. Docker also allows you to create new Docker images and update existing ones for your application. These images are read-only templates; an image could consist of, say, the Ubuntu Linux operating system with Apache plus your application and its complete configuration. Docker also provides the Docker Registry, which can be configured as a private or public store for uploading and downloading your Docker images.
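
As a quick illustration of this client-daemon workflow (the image and repository names below are just examples, not part of this article’s scenario), the Docker client pulls a read-only image from a registry, runs a container from it, and can push a customized image back:

# download an image from the registry
docker pull ubuntu:14.04
# start an interactive container from that image
docker run -it ubuntu:14.04 /bin/bash
# upload your own image to a registry (hypothetical repository name)
docker push username/my-app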

Docker also simplifies building multi-container applications. You can define multiple Docker containers within a single file and bring up the whole multi-container application with a single command.
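
This is what Docker Compose provides. Here is a minimal sketch; the service layout and the nginx and postgres images are purely illustrative and are not taken from this article’s example:

# define two containers (a web front end and a database) in a single file
cat > docker-compose.yml <<'EOF'
web:
  image: nginx
  ports:
    - "80:80"
  links:
    - db
db:
  image: postgres
EOF

# build and start the whole multi-container application with a single command
docker-compose up -d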

Now you can take Docker to the next level by using it in the cloud

You can deploy your applications in the cloud and scale them to meet ever-changing user demand. Today many cloud providers let you host your application within Docker containers, including Microsoft Azure, Amazon Web Services, Google, Joyent, IBM, and the Rackspace cloud.

Let’s have a look at Docker deployment on some common cloud infrastructures.

Docker on Azure Cloud

Microsoft Azure lets you deploy Docker on Azure with the Azure Docker VM Extension. The Azure Linux Agent manages the extension to create a Docker-enabled VM where you can build, run, test, deploy, and share your application in Docker containers. Azure offers two ways to deploy a Docker-based Linux VM:

  1. Configure the Docker VM Extension from the Azure Cross-Platform Interface
  2. Configure the Docker VM Extension from the Azure Portal

Here we will show how to configure the Docker VM Extension to create your own private PaaS using the Azure Portal. If you want to know more about configuring it through the Azure Cross-Platform Interface, visit here.
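
For reference, the Cross-Platform Interface route used the azure vm docker create command. The sketch below is only an assumption of typical usage: the DNS name, image name, and credentials are placeholders, and the exact options depend on your CLI version.

azure vm docker create -l "West US" <dns-name> <ubuntu-14.04-image-name> <username> <password>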

Configure Docker Virtual Machine

  1. To start with Docker, first create a new Azure Virtual Machine with Ubuntu 14.04 LTS from the Image Gallery, or use an existing Azure Virtual Machine running Ubuntu 14.04 LTS.
  2. For Docker communication over HTTPS, create a new certificate and key file and place the files in the ~/.docker directory on the client computer.
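
A common way to generate the client certificate and key is with OpenSSL. The commands below are only a sketch: the file names follow Docker’s usual TLS conventions, and a certificate authority (ca.pem with ca-key.pem) is assumed to exist already.

# generate a private key and a certificate signing request for the client
openssl genrsa -out key.pem 4096
openssl req -subj '/CN=client' -new -key key.pem -out client.csr
# mark the certificate for client authentication and sign it with your CA
echo extendedKeyUsage = clientAuth > extfile-client.cnf
openssl x509 -req -days 365 -sha256 -in client.csr -CA ca.pem -CAkey ca-key.pem -CAcreateserial -out cert.pem -extfile extfile-client.cnf
# place the CA certificate, client certificate, and key where the Docker client expects them
mkdir -p ~/.docker
cp ca.pem cert.pem key.pem ~/.docker/
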
  3. Add the Docker VM Extension to the Azure Virtual Machine:
    1. Select the specific Virtual Machine
    2. Scroll down to Extensions to get the list of VM extensions
    3. Add the Docker VM Extension
    4. Add the certificate and key files
  4. Define endpoints for Docker communication by specifying the protocol, public port, and private port (Docker’s TLS port, 2376, is used in the example below).
  5. Finally, test the connection between your Docker client and the Azure Docker host with the following command:

docker --tls -H tcp://dockerextension.cloudapp.net:2376 info

Note: You should replace “dockerextension” with the subdomain for your Azure Virtual Machine

  6. Now you are ready to deploy your application in a Docker container.
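
For example, with the extension configured you can start a container on the Azure Docker host straight from your client machine (the nginx image here is only an illustration):

docker --tls -H tcp://dockerextension.cloudapp.net:2376 run -d -p 80:80 nginx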

Docker on Amazon Web Services

Using Docker on the Amazon Web Services cloud allows you to build, run, test, deploy, and share your Linux container-based applications. Amazon EC2 Container Service, also known as Amazon ECS, lets you launch and stop container-based applications with API calls and gives you visibility into the state of your application cluster through a centralized service.
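
As an illustration of those API calls, the same operations are exposed through the AWS CLI; the cluster name demo below is just an example:

# create an ECS cluster, list your clusters, and inspect the cluster's state
aws ecs create-cluster --cluster-name demo
aws ecs list-clusters
aws ecs describe-clusters --clusters demo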

The following are the steps to install Docker on an Amazon Linux instance.

Configure Docker Instance

  1. Provision a new Amazon EC2 instance using the Amazon Linux machine image (AMI).
  2. Once the instance is provisioned, connect to it using SSH. For more details, visit here.
  3. Now update the installed packages and package cache of your Amazon Linux instance with the following command:

sudo yum update -y

  4. Install Docker. Amazon ECS recommends Docker version 1.7.1, but the minimum requirement is Docker version 1.5.0:

sudo yum install -y docker

  5. After Docker is installed successfully, start the Docker service:

sudo service docker start

  6. To run Docker commands without using sudo, add the ec2-user to the docker group:

sudo usermod -a -G docker ec2-user

  7. Log out and log back in to apply the new docker group permissions.
  8. Verify that the ec2-user can run Docker commands without sudo:

docker info

  9. Now you are ready to build your application in a Docker container.
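
If you want a quick sanity check before building your own image, running the standard hello-world image confirms that the daemon can pull and run containers:

docker run hello-world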

Package your application into a Docker container image

Once you have successfully installed the Docker engine, let’s discuss how to create a new Docker container image for your application. To explore this scenario, we will use a Django application (Django is a free and open-source web application framework written in Python). The following are the high-level steps to package your application into a Docker container image:

  1. To start with, create a new directory and copy your application source into it. We will use this directory to build the Docker image we want to create.

$ mkdir -p dockyard/hello_django_docker

$ cd dockyard/hello_django_docker

$ git clone

NOTE: To demonstrate copying the application source into the newly created directory, we have uploaded the application source files to GitHub.

  2. Once you have a working directory containing the source code of your application, proceed to create the Docker-related files.
  3. Now create an entry-point script. The Docker container will use the steps in this script as the default commands to run when the container starts.

Note: Save your script as docker-entrypoint.sh and make the file executable:

$ chmod u+x docker-entrypoint.sh
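
A minimal entry-point script for this Django example might look like the following. The exact commands depend on the application, so treat this as a hypothetical sketch rather than the file from the repository:

#!/bin/bash
# docker-entrypoint.sh - hypothetical default commands for the Django container
set -e
# apply database migrations, then run the server in the foreground
python manage.py migrate --noinput
exec python manage.py runserver 0.0.0.0:8000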

  4. Create the Dockerfile. The Dockerfile contains the definition of the container: the steps needed to create the Docker image for your application.

NOTE: This file includes information such as the base OS for your application image, environment variables, system tools, and libraries.
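
A Dockerfile for this example might be sketched as follows. The base image, package list, and port are assumptions for illustration and are not taken from the repository:

cat > Dockerfile <<'EOF'
# base OS for the application image
FROM ubuntu:14.04
# system tools and libraries needed by the application
RUN apt-get update && apt-get install -y python python-pip
RUN pip install django
# copy the application source (including docker-entrypoint.sh) into the image
COPY . /app
WORKDIR /app
EXPOSE 8000
# run the entry-point script by default
ENTRYPOINT ["/app/docker-entrypoint.sh"]
EOF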

  5. Build the Docker container image for your application. The naming convention for the image should be username/image-name; with this convention, your image is added directly to your account when you upload it to a repository.

$ docker build -t user/hello_django ~/dockyard/hello_django_docker

  6. Finally, verify that the newly created image appears in your local image list:

docker images

In the next part of this article series, we will talk about the options available for storing and distributing Docker images.

Stay tuned!!!