
Why containers will drive transformations in the 2020s

Containers and microservices allow enterprises to speed up implementations, choose platforms based on specific applications and optimize environments for success.

Every decade or so a new IT infrastructure model barges onto the scene and changes the way organizations use technology. In the 1990s, client/server architectures put computing resources in the back room and doled them out across the network. In the 2000s, virtual machines (VMs) made it possible to emulate one computer’s resources on another. Cloud hit big in the 2010s, helping companies become more agile and cost-focused.

Now that we’re entering a new decade, what model will dominate the conversation? Based on current trends and expert forecasts, it’s clear that the 2020s will be defined by containers and microservices.

Containers, of course, aren’t brand new. Some describe the technology as another name for VM partitioning, which dates back to the 1960s. Google introduced a container cluster management system in 2003. Docker popularized the concept with the introduction of its container platform in 2013. Gartner forecasts that half of all companies will use some kind of container technology by 2020.

But the emphasis organizations are putting on the technology today is increasing to the point where they’re making it a critical part of their overall transformation process. Companies we talk to are embarking on different journeys – some to the public cloud, some to a hybrid cloud environment, others embracing a blend of hybrid IT. They all have different goals and different timelines. What they have in common is a respect for the value containers and microservices can provide in helping them streamline their IT processes.

Getting containers right is critical to success in any transformational journey. There are many key facets to creating a sound container strategy and questions to answer along the way. How do containers actually work? How does data fit into an overall container plan? How do you secure containers against outside threats? And what’s the end goal of a container strategy? What does “good” look like?

We’ll explore many of these questions in future articles. For now, given the impact the technology is expected to have over the next decade, it’s worth looking more closely at what containers are and why they’re becoming so popular.

Anybody involved in enterprise IT has at least a passing knowledge of containers. Like their physical counterparts, these virtualized packages stow items away for future use. They contain all the executables an IT team needs to run everything from a small microservice, like a single HTTP endpoint, to a much larger application, like a payroll program. Each one has its own binary code, libraries and configuration files – but it doesn’t carry a full operating system image; containers share the host’s kernel. That makes them lighter and easier to transport than applications in traditional hardware or VM environments.
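
To make that concrete, here is a minimal sketch of a container image definition for a single-HTTP-endpoint microservice, written as a Dockerfile. The file name app.py and the Flask dependency are illustrative assumptions, not details from any specific product:

    # Hypothetical Dockerfile for a single-endpoint microservice.
    # The base image supplies userland libraries only; the running
    # container shares the host's kernel rather than booting its own OS.
    FROM python:3.12-slim

    WORKDIR /srv
    COPY app.py .                          # the service's own code (assumed to exist)
    RUN pip install --no-cache-dir flask   # its library dependencies
    EXPOSE 8080                            # its configuration: the port it serves
    CMD ["python", "app.py"]               # how to start the executable

Everything the service needs travels inside the image; nothing about the host is baked in, which is what makes the result portable.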

Containers offer a wide variety of benefits. Chief among them are speed, choice and the ability to optimize based on the situation.

The need for speed

Speed, of course, is critical in today’s IT world. Moving software quickly through the various stages of development improves efficiency, increases productivity and leaves more time for testing and quality control. Fast processes enable firms to get to market sooner and update more frequently. That’s the name of the game.

Using containers, your teams can speed up delivery in two ways. First, because VMs contain entire operating systems, they take time to boot each time they’re used. Containers don’t need to boot an operating system; they start as processes on a kernel that is already running.
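
You can see this for yourself with a quick experiment (this sketch assumes Docker is installed and the small alpine image has already been pulled):

    # Start a container, run one command, and remove it.
    # No guest OS boots; the container is just a new, isolated
    # process on the host's already-running kernel.
    time docker run --rm alpine echo "ready"

On typical hardware this completes in well under a second. A comparable VM would first have to boot a full guest operating system before doing any work.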

Second, teams using containers can release software in smaller segments than they can in legacy waterfall processes. Containers also strip away layers of software that stand between the application and the hardware performing the task at hand. Ideally, you want purpose-built hardware serving the app alone. If, for instance, you have an AI app and want to use a graphics processing unit (GPU), you want to use that GPU as effectively as possible. The more software sitting between the GPU and the orchestration function, the less effectively it will work. Stripping away unnecessary software gives you a higher density of containers per computer and better utilization of that system, increasing processing speed for that particular use case.
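
As an illustration of that direct path to the hardware, recent Docker releases can hand a GPU straight to a container. This is a sketch, assuming Docker 19.03 or later with the NVIDIA container toolkit installed; the image tag is illustrative:

    # Give the container direct access to all host GPUs and
    # query them; the driver comes from the host kernel, with
    # no hypervisor or guest OS in between.
    docker run --rm --gpus all nvidia/cuda:12.4.0-base-ubuntu22.04 nvidia-smi

The container talks to the physical GPU through the host’s driver, which is exactly the short software path the density and utilization argument depends on.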

Enabling choice

When you have a standardized container format – and vendors have all agreed to support that standard – you can choose which platform to use for a particular application. This is important to a multi-cloud or hybrid cloud strategy. Choice matters when you are dealing with issues such as data privacy and data residency. And if you have to run an application close to an edge device, you need a choice about where to place it.

Having a container strategy gives you ubiquity across the vendors that support a common set of container platform APIs. Kubernetes has emerged as the platform of choice across the industry; its shared API set is what gives users that choice today. Three years from now, you may make a different choice. And because Kubernetes is open source, you can decide which distributions in the marketplace work for you.
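
As a minimal sketch of what that common API set looks like in practice, the following hypothetical Kubernetes Deployment (the service name and image are illustrative) runs unchanged on any conforming distribution, whether on-premises or in a public cloud:

    # A standard apps/v1 Deployment: the same manifest works on
    # any Kubernetes cluster that supports the common API set.
    apiVersion: apps/v1
    kind: Deployment
    metadata:
      name: payroll-endpoint            # hypothetical service name
    spec:
      replicas: 3                       # run three identical containers
      selector:
        matchLabels:
          app: payroll-endpoint
      template:
        metadata:
          labels:
            app: payroll-endpoint
        spec:
          containers:
          - name: web
            image: registry.example.com/payroll:1.0   # illustrative image
            ports:
            - containerPort: 8080

Because the API is the same everywhere, moving this workload between providers is largely a matter of pointing your tooling at a different cluster – which is precisely the kind of choice described above.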

Optimizing your environment

There’s a decent amount of debate about whether to use containers with VMs or in place of VMs. In our opinion, running containers on their own enables organizations to take advantage of all their lightweight features and optimize their environments for success. You can pack more applications onto a host computer using containers in place of VMs. Running containers inside a VM, by contrast, is like hitching a horse to the front of a car. Why take a technology optimized for speed and run it inside a VM, a step backward?

A container strategy has to bridge both public and private environments. Containers let you take a step forward by collapsing the technology stack and eliminating the unneeded weight that typically comes along with virtual machines.

Now is the time

Without a container strategy, you are betting that a single vendor will solve all your problems. You’re taking a risk that the world will evolve the way you envision it, and you’re limiting your ability to adapt to change.

Containers are here to stay. They offer a host of benefits for organizations seeking speed, choice and flexibility in the way they deliver software. As the new decade approaches, now is the time to start mapping out what your new ecosystem will look like across data centers and the cloud. That overarching strategy should include containers as a vital piece of your core infrastructure.

This article/content was written by the individual writer identified and does not necessarily reflect the view of Hewlett Packard Enterprise Company.