Lately, it seems like everyone just can’t stop talking about containers. But I’m sensing a distinct lack of real understanding of the technology from many people, not to mention lots of confusion about what containers really mean for today’s datacenter folks. So I set about learning more and figuring out for myself what the deal is with containers. Here’s where I’m at.
Containers Aren’t The New Virtualization
To hear some folks tell it, containers generally, and Docker specifically, are “virtualization done right”, bringing the benefits without the drawbacks. But this is laughably off, in my opinion. Although container technology shares some elements with hypervisor-based server virtualization, the two are radically different in intent and in practice.
Let’s go back to the original intent: Containers and similar technologies were meant to create a consistent and contained environment in which an application could be run. This is the key concept, and what makes a container radically different from a virtual machine.
When you create a container, you specify which operating system version and which associated services and utilities are available to the application you wish to run, rather than running it “wide open” on a full OS. With Docker, the resulting image is identified by a cryptographic hash, so you can be sure that the environment is exactly the way you specified it every time you start the application.
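To make that concrete, here is a minimal Dockerfile sketch (the application file and packages are placeholders I invented for illustration, not anything from this post): it pins a specific base OS image and adds only what the app needs.

# Hypothetical Dockerfile: pin the base OS and add only what the app needs.
FROM ubuntu:16.04
# For an exact, verifiable base you can pin by content digest instead:
# FROM ubuntu@sha256:<digest>
RUN apt-get update && \
    apt-get install -y --no-install-recommends python3 && \
    rm -rf /var/lib/apt/lists/*
COPY app.py /app/app.py
CMD ["python3", "/app/app.py"]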
This gives the developer far more control and reassurance than they have ever had before. Many software issues (especially security and performance problems) stem from unrelated utilities and services running alongside an application. By packaging an application in a container, many of these issues vanish. Whether the application runs on a laptop, in a datacenter, or in the cloud, the local environment is exactly the same.
Notice that I never once talked about hardware utilization or even about sharing hardware. Those are things that both virtualization and containerization deliver, but it’s not the point of containers. It’s simply a side effect of how this consistent application environment is created. Since containers isolate application environments, we can run multiple containers on the same hardware. But a one-container-per-server install is just as valid!
Check out the Docker videos from Cloud Field Day and Tech Field Day Extra for lots more info!
Containers Are About Applications
Because containers specify an environment in which to run an application, they are truly transformative in practice. As mentioned, a developer can be sure that her application will run in exactly the environment she specifies (at least from the start). And she can leverage the work of other developers with confidence that their applications will work in a consistent and predictable manner, too.
Historically, too much developer time has been wasted “fighting” with application installs. If an application needs a database back end or a web server, the developer has to learn how to properly install and configure that software before she can begin to work! This has really held back development, since it’s easier to stick with “old faithful” software than to risk fighting with a cool new database, even if it’s technically a better fit.
This is why developers love containers so much. They take away the old IT operations issues (compatibility, configuration) and let the developer get on with the coding. This is especially true with Docker, since it comes with a marvelous library of prebuilt images on Docker Hub to draw on. It’s literally a one-liner to download, configure, and run any of a huge number of application components.
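For instance, a single command can pull and start the official PostgreSQL image (the container name, password, and port mapping here are throwaway choices of mine, just to show the shape of it):

# Pull the official PostgreSQL image and run it in the background,
# publishing its default port on the host (password is a placeholder).
docker run -d --name demo-db -e POSTGRES_PASSWORD=example -p 5432:5432 postgres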
Docker Is a Gift for IT Operations
Because of this ease of use aspect, IT is beginning to embrace container technology, and especially Docker. Typically, IT operations folks are a little nervous about new technologies, and this is especially true of developer-focused tech. But once they try Docker, a light goes on!
This is what happened for me: I dithered about trying Docker until my friends pushed me to give it a look. Then I mucked about with it, unsure of the value. Then I typed docker-compose up -d and the light finally went on. Install Docker Compose and run WordPress. I guarantee you’ll get it!
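If you want to try it, a minimal docker-compose.yml along these lines will do (the passwords and published port are placeholder values I picked for illustration):

# Hypothetical docker-compose.yml: WordPress plus its MySQL database, wired together.
version: '2'
services:
  db:
    image: mysql:5.7
    environment:
      MYSQL_ROOT_PASSWORD: example
      MYSQL_DATABASE: wordpress
      MYSQL_USER: wordpress
      MYSQL_PASSWORD: wordpress
  wordpress:
    image: wordpress
    depends_on:
      - db
    ports:
      - "8080:80"   # browse to http://localhost:8080 once it starts
    environment:
      WORDPRESS_DB_HOST: db
      WORDPRESS_DB_USER: wordpress
      WORDPRESS_DB_PASSWORD: wordpress
      WORDPRESS_DB_NAME: wordpress

Save that as docker-compose.yml, run docker-compose up -d, and a working WordPress site appears on port 8080.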
Once I saw how easy and (more importantly) predictable it is to run Linux applications in Docker containers, I never wanted to do it any other way again. Can you imagine no longer having to fight just to get an application installed? Sure, you still have to configure it to meet your needs, but so much of my life has been wasted just getting to that point!
I envision a future where everything we install is shipped in a Docker container or something similar. And it’s not just for Linux: Microsoft is serious about Docker, too! Can you imagine a world without DLL and registry incompatibilities? Where you’re no longer worried that installing an application is going to interfere with the functions of another? I sure can!
I’ll be presenting an expanded version of this article as a keynote at the DeltaWare Data Solutions Emerging Technology Summit on October 25. If you’re able to get to Edina, MN, please register to attend!
Stephen’s Stance
Containerized applications are a godsend for developers and IT operations alike. They promise to eliminate one of the biggest sources of IT headaches: getting software up and running. And I haven’t even talked about the benefits of shared infrastructure, scalable applications, and “sandboxed” everything! We’ll get to that, but in the meantime, go try Docker for yourself. It’s transformational!