“Build once, configure once, and run anywhere.” That’s Docker summarized in one line. When choosing from the great number of available application-development technologies, a key concern is the ability to deploy applications in any environment without overhead. Deploying services or applications across multiple environments can lead to conflicts between services, so one must address issues such as quick migration, scaling, and performance.
Shortcomings of Virtual Machines (VMs)
Up to now, virtual machines have been the go-to method for packaging and distributing applications across various environments. They can be used to create isolated environments, to package them using tools such as Vagrant, and to ship them where needed. However, VMs have their shortcomings:
1. A VM’s size can grow very large when trying to handle all of the required dependencies and packages.
2. They are resource intensive, consuming a great deal of CPU and memory. A complex scenario, such as scaling an application across multiple providers, introduces further problems, such as running out of disk space.
3. From a developer’s perspective, the tools they offer for building and testing applications are limited.
4. They impose significant performance overhead, especially for I/O operations.
Why Docker Instead of Linux Containers (LXCs)?
LXCs are lightweight and allow multiple isolated instances to run on the same host. They share a single kernel, but each instance can be limited to a defined amount of resources. LXCs run isolated instances securely, without interference among those instances.
Docker is not a replacement for LXCs, but rather an added layer to make them easier to use for more complex operations. Docker distinguishes itself from LXCs in several ways:
1. It enables version tracking, with complete traceability from the state of a production server back to the developer who built the container.
2. It helps avoid dependency hell by providing complete isolation of resources, network and content.
3. It builds easily shareable containers incrementally.
4. It supports an image-sharing ecosystem.
5. Its container reusability allows one to create more specialized components.
6. It relies on operating-system primitives rather than hardware-specific features, so it is hardware agnostic and containers run consistently on any hardware.
7. It provides a distinct separation of duties that makes life easier for developers and operations teams.
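Points 1 through 4 above come down to Docker’s layered image model: each instruction in a Dockerfile produces a cached, shareable layer. The Dockerfile below is a minimal sketch; the image tag `myorg/myapp` and the file `app.py` are illustrative assumptions, not taken from any real project:

```dockerfile
# Each instruction creates one cached image layer; rebuilds
# reuse unchanged layers, which is what makes builds incremental.
FROM ubuntu:14.04

# Dependency layers change rarely, so they are cached across rebuilds.
RUN apt-get update && apt-get install -y python

# Application code changes often, so it goes in a late layer.
COPY app.py /opt/app/app.py

CMD ["python", "/opt/app/app.py"]

# To build, tag, and share the image (requires a Docker daemon):
#   docker build -t myorg/myapp:1.0 .
#   docker push myorg/myapp:1.0
```

Because unchanged layers are reused, pushing a new version of the application uploads only the layers that actually changed.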
Docker Use Cases
Docker makes possible several interesting use cases, as described on Docker’s website. For example, you can:
1. Build your own PaaS.
2. Create a web-based environment for instruction.
3. Deploy applications easily.
4. Create secure sandboxes.
5. Automate application deployment.
6. Create lightweight desktop virtualization.
7. Reproduce environments between dev, production, QA, etc.
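As a sketch of use case 7, reproducing environments means running the exact same image everywhere and injecting only environment-specific configuration at start time. The commands below are illustrative and assume a running Docker daemon and a hypothetical `myorg/myapp` image:

```shell
# Pull the exact same image in dev, QA, and production.
docker pull myorg/myapp:1.0

# Only the injected configuration differs between environments.
docker run -d -e APP_ENV=qa         -p 8080:5000 myorg/myapp:1.0
docker run -d -e APP_ENV=production -p 80:5000   myorg/myapp:1.0
```

Since the image itself is immutable, differences between dev, QA and production are reduced to the flags passed at run time.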
Check out our previous post, “Using Docker for Quick And Easy Multi-Tenancy,” to see how Flux7 Labs designed its own Docker use case.
Also, check back soon for our next post on Docker’s value within the developer environment.