A couple of weeks ago we attended DockerCon, the inaugural Docker-centric conference for developers, and anyone else, interested in the open platform for building, shipping, and running distributed applications, whether on laptops, data center VMs, or in the cloud. We were there not only as a founding System Integration partner but also as a presenter.
Our CEO Aater Suleman gave a joint presentation with one of our customers. While the conference was interesting, I felt it was too driven by implementation and not focused enough on use cases. So, in this post, I thought I’d introduce and share some use cases for Docker.
Before we can talk about how Docker can be used, let's look at the special capabilities this tool brings to the table.
Docker provides lightweight virtualization with almost zero overhead, and this delivers some significant advantages.
The first is that you get the benefits of the extra layer of abstraction Docker offers without having to worry about the overhead. The second is that you can run many more containers on a single machine than you could VMs.
Another powerful benefit is that containers can be brought up and torn down in seconds. Solomon Hykes, founder of Docker Inc., has a great overview of what Docker adds to plain LXCs here.
Here are just some of the use cases in which Docker's enabling technology, a consistent environment at very low overhead, pays off.
Simplifying Configuration

This is the primary use case Docker Inc. advocates. One of the big advantages of VMs is the ability to run any platform, with its own configuration, on top of your infrastructure. Docker provides the same capability without the overhead of a VM. It lets you put your environment and configuration into code and deploy it. The same Docker configuration can be used in a variety of environments, decoupling infrastructure requirements from the application environment.
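As a minimal sketch of "environment and configuration as code": a Dockerfile captures the environment, and the resulting image runs the same way anywhere. The base image, package, and names here are illustrative assumptions, not taken from the post.

```shell
# Hypothetical Dockerfile, written inline for illustration; the base
# image and installed package are assumptions.
cat > Dockerfile <<'EOF'
FROM ubuntu:14.04
RUN apt-get update && apt-get install -y python
COPY app.py /opt/app/app.py
CMD ["python", "/opt/app/app.py"]
EOF

# The same image definition builds and runs identically on a laptop,
# a data-center VM, or a cloud instance.
docker build -t myorg/myapp .
docker run -d myorg/myapp
```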
Code Pipeline Management
The previous use case makes a large impact in managing the code pipeline. As the code travels from the developer’s machine to production, there are many different environments it has to go through to get there. Each of these may have minor differences. Docker provides a consistent environment to the application from dev through production, easing the code development and deployment pipeline.
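One way to picture this consistency: build the image once, then promote the exact same artifact through every stage of the pipeline. The registry host and tag below are hypothetical.

```shell
# Build once, push to a shared registry (names are placeholders).
docker build -t registry.example.com/myapp:1.0 .
docker push registry.example.com/myapp:1.0

# Dev, staging, and production all pull and run the exact same bits,
# so "works on my machine" differences disappear.
docker pull registry.example.com/myapp:1.0
docker run -d registry.example.com/myapp:1.0
```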
Developer Productivity

This leads to some additional advantages Docker offers for developer productivity. For a detailed example, check out Aater's talk from DevOpsDays Austin 2014 or DockerCon.
In a developer environment, we have two goals that are at odds with each other. We want it to be as close to production as possible, and we want it to be as fast as possible for interactive use.
Ideally, to achieve the first goal, we would run every service in its own VM to mirror how the production application runs. However, we don't want to require an Internet connection and add the overhead of working remotely every time a compilation is needed. This is where Docker's near-zero overhead comes in handy. A development machine usually has limited memory, and because Docker does not add the per-instance memory footprint that a VM does, it easily allows a few dozen services to run side by side.
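A sketch of what this looks like in practice: each production service gets its own container on the developer's laptop. The image names are placeholders; each container adds only the memory its process needs, not a full guest OS.

```shell
# One container per service, mirroring the production topology
# (image names are illustrative).
docker run -d --name db    postgres
docker run -d --name cache memcached
docker run -d --name queue rabbitmq

# The app container wired to its dependencies via container links.
docker run -d --name web --link db:db --link cache:cache myorg/web
```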
App Isolation

There may be many reasons you end up running multiple applications on the same machine. The developer productivity flow described earlier is one example, but there are other cases, too.
A couple to consider are server consolidation to decrease cost, or a gradual plan to separate a monolithic application into decoupled pieces. To understand why it is important to create decoupled applications, read this essay by Steve Yegge, who compares Google and Amazon.
Server Consolidation

Just like using VMs to consolidate multiple applications, the application isolation abilities of Docker allow consolidating multiple servers to save on cost. However, because it avoids the memory footprint of multiple OSes and can share unused memory across instances, Docker provides far denser server consolidation than you can get with VMs.
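A sketch of consolidation with isolation: several applications share one host, each capped so a noisy neighbor cannot starve the rest. The application names and memory limits are illustrative assumptions.

```shell
# Each consolidated app runs in its own container with a memory cap
# (-m); names and limits are made up for illustration.
docker run -d -m 256m --name billing   myorg/billing
docker run -d -m 512m --name reporting myorg/reporting
docker run -d -m 128m --name cron-jobs myorg/cron-jobs
```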
Debugging Capabilities

Docker provides many tools that are not necessarily specific to containers but work well with the container concept, and they offer extremely useful functionality. These include the ability to checkpoint and version containers, as well as to diff two containers. This can be immensely useful in fixing an application. You can find an example of this in our "Docker Saves the Day" post.
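To make the checkpoint/version/diff idea concrete, here is roughly what that workflow looks like with the Docker CLI; the container and image names are hypothetical.

```shell
# Show which files changed inside the container since the image was
# built: invaluable when a deployed app misbehaves.
docker diff web-1

# Checkpoint the container's current state as a new, versioned image.
docker commit web-1 myorg/web:debug-snapshot

# Inspect the layer-by-layer history of that image.
docker history myorg/web:debug-snapshot
```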
Multi-tenancy

Yet another interesting use case for Docker is in multi-tenant applications, where it can help avoid major application rewrites. Our very own example was to build quick and easy multi-tenancy for an IoT application. Code bases for multi-tenant applications tend to be complicated, rigid, and difficult to maintain, and rearchitecting an application is not only time-consuming but also very expensive.
Using Docker, it was easy and inexpensive to create isolated environments running multiple instances of the app tiers for each tenant. This was possible given the spin-up speed of Docker environments and its effective diff command.
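As a sketch of that approach (tenant names, the environment variable, and the image are all made up for illustration), each tenant simply gets its own container running the same app tier:

```shell
# One isolated app-tier instance per tenant; the TENANT_ID variable
# and image name are hypothetical.
for tenant in acme globex initech; do
  docker run -d --name "app-$tenant" -e TENANT_ID="$tenant" myorg/iot-app
done
```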
You can learn more about this use case here.
Rapid Deployment

Before VMs, bringing up a new hardware resource took days. Virtualization brought this number down to minutes. Docker, by creating just a container for the process rather than booting an OS, brings it down to seconds. This is the enabling technology that has led Google and Facebook to use containers.
You can create and destroy resources in your data center without worrying about the cost of bringing them up again. With typical data center utilization around 30%, the low cost of bringing up new instances allows a more aggressive allocation of resources, making it easy to bump that number up.
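The second-scale bring-up is easy to see from the command line; the image name is a placeholder.

```shell
# Starting a container launches only the process, not a guest OS,
# so it completes in seconds (image name is illustrative).
time docker run --rm myorg/worker echo "ready"

# Create capacity on demand and destroy it when load drops.
docker run -d --name burst-worker myorg/worker
docker rm -f burst-worker
```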
There are many more documented use cases that show Docker as an appropriate solution. These span the spectrum from security to developer empowerment. However, this post is already getting too long.
At Flux7, we have completed several projects using Docker. You can explore our case studies about some of these projects here.
For us, the motivation to use Docker is always about using the right tool for the job. The most interesting part is the business case; we build from there.
As DevOps takes hold and more organizations explore Docker, microservices are emerging as a logical next step.
Get Started with AWS
Receive AWS tips, DevOps best practices, news analysis, commentary and more. Sign up for our IT Modernization blog here and set your topic and frequency preferences. Or, download our guide on getting started with AWS, establishing a secure AWS enterprise architecture with Flux7 Landing Zones.