There are two things that have stayed consistent between the container conversations I had back in 2013 and the ones I hear today: containers still aren’t ready for widespread enterprise use, and the industry is still wildly excited any time a container conversation pops up.
I had this experience recently when I mentioned containers during a broader talk about enabling data access from anywhere, with business continuity, disaster recovery and metro-level high availability. As I flipped to a slide showcasing the potential for extending this access to containers, my audience got very interested. I’m often surprised when this happens, because we all must know that container adoption hasn’t substantively progressed during the last four years – at least not at the enterprise level, and not beyond using containers as playthings.
There are a few reasons why this industry continues to be so enthusiastic about containers, and why we should all calm down a bit – at least for now.
- Containers have the potential to solve really difficult challenges...
A software container platform like Docker promises to make it easy to deploy apps consistently as you move them from one computing environment to another – from test to staging to production, or from on-prem to the cloud. The idea is that the container holds the whole runtime environment: not just the application, but everything needed to run it, in one tidy package. You can snap it together like a jigsaw puzzle and then deploy the container without having to provision virtual machines or set up usernames and passwords. That easy operational model is appealing.
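The packaging model described above can be sketched with a minimal Dockerfile – the application, file names and base image here are hypothetical, chosen only to illustrate the idea:

```dockerfile
# Everything the app needs travels inside the image: a base OS layer,
# the language runtime, dependencies and the application itself.
FROM python:3.9-slim

WORKDIR /app

# Dependencies are baked into the image, so the container behaves the
# same way in test, staging, production or the cloud.
COPY requirements.txt .
RUN pip install -r requirements.txt

COPY . .

# No VM to provision, no host usernames or passwords to configure –
# the image itself is the deployable unit.
CMD ["python", "app.py"]
```

Building the image once (`docker build -t myapp .`) and running that same image in every environment (`docker run myapp`) is what delivers the consistency the paragraph describes.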
- But containers have historically been storage-lite...
The jigsaw puzzle enterprises need to build comes with a lot of data storage underneath, and container technology has not yet tackled the storage integration problem. This is one reason Docker hasn’t turned into the virtualization killer people expected it to become. The good news is that Docker is starting to recognize that companies can’t use this technology for data-heavy apps, and that realization is a step in the right direction.
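The common workaround today is to keep state outside the container entirely, in a volume the container merely mounts. A minimal docker-compose sketch – the service and volume names are hypothetical – shows the pattern:

```yaml
version: "3"
services:
  db:
    image: postgres:13
    volumes:
      # The database files live in the named volume "dbdata", not in
      # the container's writable layer, so the container can be
      # destroyed and recreated without losing data.
      - dbdata:/var/lib/postgresql/data

volumes:
  dbdata:
```

Note that this sidesteps rather than solves the integration problem: the volume still has to be backed by storage the container platform knows little about.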
- And enterprises are wary of open source, in general...
There’s an instructive parallel between Docker and Linux. It took 20 years for enterprises to adopt Linux. They didn’t trust it because it was free, there was no one to support it, and if something didn’t work, they’d need in-house expertise to dig into the source code. For many years, large businesses viewed Linux as a toy designed by random coders all over the world. It took some big players to bring Linux to the enterprise in a widespread way.
The people who have been using Docker over the past few years are happy with it. They don’t pay for it, and they can take advantage of its features for apps that don’t carry large data footprints. For containers to move beyond this group of advocates into the mainstream, two things need to happen.
First, we need an answer to the question of where data storage will go – inside the container or outside it. This was a religious-level argument for a long time, but it seems to be getting settled: storage will live outside containers. Second, the market is waiting for traditional enterprise vendors to jump into the fray. Some of this is happening up the stack now, with EMC donating a lot of code to the container orchestration community. It will take this level of contribution from players like Dell/EMC, HP and others before containers can move from open source novelty to enterprise platform.
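The “storage outside the container” settlement shows up concretely in Docker’s volume-driver plugin interface, which is where vendor contributions such as EMC’s open-sourced REX-Ray project plug in. In the hypothetical sketch below, the driver name and its options are illustrative and depend on which plugin is actually installed:

```yaml
version: "3"
services:
  db:
    image: postgres:13
    volumes:
      - dbdata:/var/lib/postgresql/data

volumes:
  dbdata:
    # A third-party volume driver provisions the storage on an external
    # array or a cloud block store; "rexray/ebs" is one published
    # plugin, shown here purely as an example.
    driver: rexray/ebs
    driver_opts:
      size: "20"
```

The point of the interface is exactly the enterprise hand-off the article is waiting for: the container platform stays storage-agnostic, while a vendor-supplied driver handles provisioning, attachment and lifecycle of the underlying volumes.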
Until then, the industry’s enthusiasm for containers is more about aspiration than actuality.
Keep up with trends in data storage. Tune into our live webinar, “Get out of the box: what you’ll need to know next about storage.”