If you haven’t heard about the latest innovation to power cloud technology, get ready for an important lesson.
If you’re not familiar with the terms Dockers and containers, then to start it might be worth revisiting how full machine virtualization works. Virtual machines (virtual versions of a device or a resource, such as a server, storage or memory) are created by, and run on, a piece of software, firmware or hardware called a hypervisor. The hypervisor itself runs on a host machine, either directly on the server hardware or on top of a host operating system.
VMs, IaaS and Amazon
Virtualization is important in the Infrastructure-as-a-Service (IaaS) industry, where clients rent cloud infrastructure (servers, storage and networking) on demand, in a pay-as-you-go model. Here, virtual machines run on physical servers. Today’s data centres and clouds, such as Amazon Web Services and Microsoft’s Azure, use this hypervisor system. However, the trouble with virtual machines, or VMs, is that each VM runs a full copy of an operating system to host an application, including all the components that application doesn’t need. The result is significant memory, bandwidth and storage consumed unnecessarily.
Containerization, on the other hand, is a lighter alternative to full machine virtualization. In reality, it is just another approach to virtualization, but one for which an entire VM does not need to be created. With containerization, there is no hypervisor. Instead, it is replaced with a container engine, which then supports bins and libraries with predefined resources. From there, applications can be compartmentalized by pulling only the resources you need to run that app.
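As a concrete sketch of “pulling only the resources you need,” here is what a minimal Dockerfile for a small Python web app might look like. The base image, file names and package list are illustrative assumptions for this sketch, not details from the article:

```dockerfile
# Hypothetical example: package one app with only what it needs,
# rather than a full guest operating system.
FROM python:3-slim                     # a slim base image, not a full OS
WORKDIR /app
COPY requirements.txt app.py ./
RUN pip install -r requirements.txt    # only the libraries this app uses
CMD ["python", "app.py"]               # one application per container
```

Built with `docker build`, this produces a self-contained image measured in tens or hundreds of megabytes, rather than the multi-gigabyte disk image a full VM would carry for the same app.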
Below is a helpful infographic that conceptualizes the difference between VMs and containers:
Where cloud technology plays into this concept of compartmentalization is that container engines can be coupled with a cloud hub, a cloud service for sharing applications and automating workflows. And the term “Dockers” comes from Docker, Inc., the company behind a popular open source container engine (more on this to follow).
Put simply, container engines allow applications to be packaged into the software equivalent of shipping containers, so they can be moved around, in networks and in the cloud, and run on any hardware. This can potentially save a data center or cloud provider millions of dollars annually in power and hardware costs. And, because resources can be provisioned minute-by-minute depending on demand, users don’t experience slowdown in performance when demand increases, nor do they end up overpaying for unused resources upon demand decrease.
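The “shipping container” workflow above can be sketched with the Docker command line. The image and registry names here are illustrative assumptions, not from the article:

```shell
# Package the app into a portable image (using a Dockerfile in the current directory)
docker build -t registry.example.com/webapp:1.0 .

# Share it via a registry or cloud hub
docker push registry.example.com/webapp:1.0

# On any other host with a container engine, pull and run the same image
docker pull registry.example.com/webapp:1.0
docker run -d -p 8080:8080 registry.example.com/webapp:1.0
```

The same image runs unchanged on a laptop, an on-premises server or a cloud instance, which is what makes the shipping-container analogy apt.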
Phil Wainewright, writing for Diginomica, states that this technology “takes IT automation to a whole new level.” He adds: “People commonly report improvements in application density of 10x to 100x or more per physical server, brought about by a combination of the compactness of the containers and the speed with which they are deployed and removed.”
For those about to Docker, we salute you
Despite the hype surrounding it, the idea of containers is not new. In fact, containers have been used by some Platform-as-a-Service providers since their organizations’ inception. What’s causing the recent excitement is that Docker Inc., the creators of the open source Docker container project, released Docker 1.0 and the Docker Enterprise Support program. Where before, significant expertise and specialized coding was needed to use this technology, this release means enterprises can now use this technology independently, at a fraction of the cost. And companies are rushing to adopt the technology in large numbers.
Jay Lyman, senior analyst at 451 Research, explains that “Enterprise organizations are seeking and sometimes struggling to make applications and workloads more portable and distributed in an effective, standardized and repeatable way … Docker Hub [Docker’s proprietary cloud hub], Official Repos and commercial support are helping enterprises answer this challenge by improving the way they package, deploy and manage applications.”
Groupon, a deal-of-the-day website company, has started using Docker’s version of the technology as the foundation of its build-and-release pipelines, and is excited by the results. CTO of Platform Brian McCallister says, “[I]t offers huge benefits around standardization and repeatable processes, especially for a company like Groupon with such a diverse set of technologies in play. The reliability of the platform is critical, and Docker provides the best, most easily managed tool for packaging and deploying services.”
Admittedly, there are drawbacks to containers, and one of them is that, because every container shares the host’s kernel, they cannot run a different operating system the way hypervisor-based virtual machines can. However, you can fit a lot more containers on the same server, anywhere from two to five times more, depending on who you ask. As long as you don’t need the functionality of a hypervisor in your business, containers make better use of your server’s hardware.
Steven J. Vaughan-Nichols, writing for IT World, sums up the potential of Docker and container technology, especially where cloud computing at the enterprise level is concerned. He writes, “At day’s end, containers are not important for their technology, they’re important because they can substantially help your bottom line.”