Docker

Deploying applications on Virtual Machines (VMs) instead of physical machines seems like a good option. But just as on a physical machine, the same software stack must be installed on each Virtual Machine. In simple words, an application that was hosted on a physical machine can now be hosted on a virtual server, but you still need to install a separate guest OS to run each application. Another problem is that VMs are not lightweight, so only a handful of them (say, two or three) can run even on a high-performance server.

As we all know, hosting applications in the cloud means paying for hosting services, and the more resources you consume, the higher the fees. The same problem arises with VMs: infrastructure costs increase, scaling becomes difficult, and on top of that, deploying applications on these VMs is far from easy unless you switch to Docker.

What Is Docker and How Does It Work?

Docker is a tool that helps developers automate the deployment of applications using lightweight containers, so that an application can work reliably in distinct environments. Docker packages an application together with its dependencies into a container and ensures that the application runs the same way in any environment.
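As a minimal sketch of that idea, the following uses the Docker SDK for Python (the `docker` package, installable with `pip install docker`); the image tag `my-app` and the Dockerfile assumed to sit in the current directory are hypothetical placeholders.

```python
import docker

client = docker.from_env()  # connect to the local Docker daemon

# Build an image that packages the application together with its
# dependencies, as described by a Dockerfile in the current directory.
image, _ = client.images.build(path=".", tag="my-app:latest")

# Run the packaged application; the same image behaves the same way
# in any environment that has a Docker daemon.
container = client.containers.run("my-app:latest", detach=True)
print(container.short_id, container.status)
```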

To understand Docker's workflow, you first need to understand the Docker Engine and its components:

  1. Docker Daemon: Listens for and executes Docker API requests, and continuously manages Docker images, containers, networks, and storage volumes in the background.
  2. Docker Engine REST API: An HTTP API through which applications communicate with the Docker daemon; any HTTP client can access it (see the sketch after this list).
  3. Docker CLI: A command-line interface that talks to the Docker daemon through the REST API and simplifies container management.
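To make these components concrete, here is a small sketch that uses the Docker SDK for Python (the `docker` package) as the HTTP client for the Engine REST API; it lists running containers, which is the same information the CLI prints for `docker ps`.

```python
import docker

# Connect to the local Docker daemon; the SDK acts as an HTTP client
# that wraps the Docker Engine REST API.
client = docker.from_env()

# List running containers -- the REST call behind `docker ps`.
for container in client.containers.list():
    print(container.short_id, container.image.tags, container.status)
```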

Docker works on a client-server architecture, which consists of the Docker client, the Docker host, and the Docker registry. The Docker client issues Docker commands; the Docker host runs the Docker daemon; and the Docker registry stores Docker images.

The Docker client and the Docker daemon communicate over a REST API; the daemon builds, runs, and distributes containers, each defined by its image and by the configuration options provided at creation time. By default, these containers remain isolated from other containers and from their host. However, the extent of isolation can be controlled based on the requirements of the user, as the sketch below illustrates.
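As a hedged illustration of that last point (again using the Docker SDK for Python; the `alpine` image and the `sleep` command are just examples), this sketch starts one container with the default isolation and another that deliberately shares the host's network namespace:

```python
import docker

client = docker.from_env()

# Fully isolated by default: own network stack, own filesystem.
isolated = client.containers.run("alpine", "sleep 30", detach=True)

# Isolation deliberately reduced: this container shares the host's
# network namespace and therefore sees the host's interfaces.
shared_net = client.containers.run(
    "alpine", "sleep 30",
    detach=True,
    network_mode="host",
)
```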

Virtual Machines vs. Docker Containers

| Basis of comparison | Virtual Machines | Docker Containers |
| --- | --- | --- |
| Operating System Usage | Guest and host operating systems (OS) are required. | Only the host operating system is needed. |
| Application Processing | Slower application processing due to the extra layer of a guest OS. | Containers use common libraries of the host OS and hence execute applications faster. |
| Booting Speed | Virtual machines are heavy and thus take more time to boot up. | Lightweight containers boot up in a fraction of a second. |
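To see the booting-speed row in practice, here is a rough sketch that times container start-up with the Docker SDK for Python; the exact number varies by machine, but with the image already cached locally a container typically starts in well under a second.

```python
import time
import docker

client = docker.from_env()
client.images.pull("alpine")  # make sure the image is cached first

start = time.monotonic()
container = client.containers.run("alpine", "true", detach=True)
print(f"container started in {time.monotonic() - start:.2f}s")

container.remove(force=True)  # clean up the test container
```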

How Docker Helped Uber

Besides its core ride-hailing service, Uber offers a range of localized services like UberPOOL, UberKITTENS, UberIceCream, and UberEATS, and new features are added to each service almost every day. But back in 2015, this wasn't easy. For instance, every week Uber's in-house cluster management system, uDeploy, was handling:

  • 4000 upgrades
  • 3000 builds
  • 300 rollbacks
  • 600 services in the system

Keeping the application up 24/7 with a small team while making all these changes was a real challenge, and it created heavy dependencies among the teams until Docker was introduced. Instead of shifting from uDeploy to Docker outright, the two platforms were run together, and the results were striking. With Docker, Uber's team no longer had to:

  • Rely on the infrastructure team to write scaffolding.
  • Rely on IT to locate services.
  • Rely on the infrastructure team to provide services.

In simple words, tasks that earlier took hours or even weeks could now be done in about 10 minutes. Plus, Docker reduced inter-team dependencies and provided the freedom that is crucial for any business to grow.

Role of Docker in DevOps

DevOps is built around continuous development, integration, and deployment, and that is why the methodology relies on automation tools. Since Docker serves both the development team and the operations team, it plays a significant role in DevOps. After incorporating Docker, developers can focus on writing code without worrying about the environment the code will run in; operations staff, in turn, spend less effort provisioning systems, since Docker reduces the number of systems required. A build-and-push step like the sketch below is the kind of task a CI pipeline typically automates.
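As an illustrative sketch only (the registry `registry.example.com` and the image name `my-app` are hypothetical), a CI step built on the Docker SDK for Python might build an image and push it to a registry so that operations can deploy the exact same artifact:

```python
import docker

client = docker.from_env()

# Build the application image from the Dockerfile in the repository root.
image, _ = client.images.build(path=".", tag="registry.example.com/my-app:1.0")

# Push the image to the registry, streaming the daemon's progress output.
for line in client.images.push(
    "registry.example.com/my-app", tag="1.0", stream=True, decode=True
):
    print(line)
```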

Conclusion

Shifting from one methodology to another bears fruit only when teams are equipped with the right tools, and Docker is one of them. Be it faster deployment, security, or scalability, Docker offers it all, while also bringing freedom and reducing dependencies between teams.