In 2013, Docker was the “it” company. Docker made headlines for the critical role it played in bringing containers to the mainstream, and in many ways displaced PaaS as the hotness of the time (Heroku, anyone?). Now, the company is back in the press with the introduction of a new model for Docker Desktop that requires larger organizations to buy a paid subscription for the tools. There’s been a vocal reaction to this announcement, one that reminds me of the important role Docker played in popularizing a model we know, love, and now use on a mainstream basis: containers.

Docker did not invent containers, but it made the technology accessible through an open source tool and reusable images. With Docker, developers could really and truly build software once and run it locally, or on a production server.
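To make that concrete, here is a minimal sketch of the build-once, run-anywhere workflow. (The Dockerfile contents, image name, and port are illustrative placeholders, not taken from any particular project.)

    # Dockerfile: package an app and its dependencies into a single image
    FROM python:3.9-slim
    WORKDIR /app
    COPY requirements.txt .
    RUN pip install -r requirements.txt
    COPY . .
    CMD ["python", "app.py"]

    # Build the image once on a developer laptop...
    docker build -t myorg/myapp:1.0 .

    # ...then run that same image locally or on a production server
    docker run -p 8080:8080 myorg/myapp:1.0

The same image runs unchanged wherever a container engine is available, which is exactly the portability story PaaS struggled to deliver.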

The fact that the Docker command-line tool displaced years of sexy web interfaces is perhaps a commentary on what developers really want. But to really understand Docker’s impact, it’s important to go back to a time slightly before Docker container technology made its debut.

Looking for the Next Big Thing

By 2009, the value of virtualization was well understood, and the technology was widely deployed. Most organizations had already garnered the benefits of virtualization or had a roadmap to get there. The marketing machine was tired of virtualization. People were hungry for the next innovation in IT and software development. It came in the form of Heroku. In fact, PaaS in general and Heroku specifically became wildly popular, so much so that it looked like PaaS was going to take over the world.

At that time, Heroku was huge. You could just go to a portal, develop your apps, and deliver them as a service? What’s not to like? Why wouldn’t you develop apps on Heroku?

As it turned out, there were a couple of good reasons not to use Heroku and PaaS platforms of its ilk. For example, applications built on Heroku were not portable; they ran only within Heroku. Developers who wanted to collaborate had to work remotely on the PaaS platform, and unlike Netflix, it turns out, developers love to develop locally. A developer who wanted to work on a local box was left to build the environment manually.

In addition, although the Heroku model was extremely powerful if you used what was provided out of the box, it was complex behind the scenes. As soon as your team built something more complex than a simple web app, or needed to customize the infrastructure for security or performance reasons, it became a difficult, very “real” engineering problem.

It was great… until it wasn’t. But in typical IT fashion, lots of people went all in before realizing that platforms like Heroku have their place but are not the right tool for every job.

The Docker difference

Containers, on the other hand, solved many of the challenges with PaaS, and Docker was the company that made developers, IT managers, and business managers see and understand that. In fact, when Docker came out, its value was staggeringly obvious: All the things that were hard on Heroku were easy with Docker, and all the things that were easy on Heroku were also easy with Docker. With Docker you could quickly and easily fire up a pre-built service, but you could also easily develop locally and customize services to make them do what you needed.
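For example, standing up a ready-made database for local development is a one-liner. (This sketch uses the public PostgreSQL image from Docker Hub; the container name, password, and port mapping are placeholders.)

    # Fire up a pre-built PostgreSQL service in the background
    docker run -d --name dev-db -e POSTGRES_PASSWORD=example -p 5432:5432 postgres:13

    # Follow its logs, then remove it when you're done
    docker logs -f dev-db
    docker rm -f dev-db

No tickets, no provisioning queue, no vendor portal: the service runs on your own machine, and you can swap the image, pin the version, or customize the configuration as needed.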

That’s not to say that Docker was pretty. It actually leveraged a UX first made popular by Unix in the 1970s! Docker was just a command run in a Linux terminal—a far cry from the slick graphical interfaces on most PaaS platforms. But the Docker command-line interface (CLI) was really elegant. In fact, I’d argue that the Docker CLI in particular showed the world that when we bring a modern sense of UX to the development of a CLI, it can change the world.

Docker—and containers in general—provided the underlying technology for developing cloud-native applications. They worked, and they continue to work, across highly distributed architectures and within the devops and CI/CD (continuous integration and continuous delivery) models that are required today to meet new and constant customer demands for improvements without regressions (aka bugs, security problems, etc.).

Containers enable developers to change applications quickly, without breaking the functionality users rely on. Furthermore, the ecosystem that has evolved around containers—including the seemingly unstoppable Kubernetes orchestration platform—has enabled organizations to effectively scale and manage growing collections of containers.

Developers quickly understood the value of containers, and so did operations teams and Silicon Valley investors. But it took some work to convince managers, CIOs, and CEOs, who are typically won over by slick demos, that a command-line tool was better than all the bells and whistles of PaaS.

Life in a containerized world

And here we are in 2021 with a command-line tool still making waves. That’s pretty remarkable, to say the least. It even appears there’s room for two players in this market for container CLIs (see “Red Hat Enterprise Linux takes aim at edge computing” and “When do you use Docker vs. Podman: A developer’s perspective”).

Now, thanks to the road paved by container technology, developers can work locally or in the cloud much more easily than before. CIOs and CEOs can expect shorter development cycles, lower risk of outages, and even reduced costs to manage applications over their life cycle.

Docker isn’t perfect, and neither are containers. Arguably, it’s more work to migrate applications into containers than onto virtual machines, but the benefits last the full life cycle of the app, so it’s worth the investment. That’s especially true for new applications just being developed, but it also applies to lift-and-shift migrations and even refactoring work.

Docker brought container technology front and center and top of mind, displacing PaaS as the reigning hotness, and for that reason alone it really did change the world.

At Red Hat, Scott McCarty helps to educate IT professionals, customers, and partners on all aspects of Linux containers, from organizational transformation to technical implementation, and works to advance Red Hat’s go-to-market strategy around containers and related technologies.

New Tech Forum provides a venue to explore and discuss emerging enterprise technology in unprecedented depth and breadth. The selection is subjective, based on our pick of the technologies we believe to be important and of greatest interest to InfoWorld readers. InfoWorld does not accept marketing collateral for publication and reserves the right to edit all contributed content. Send all inquiries to [email protected].
