Building a Better Cloud Through Container Technology | HPE Helion Development Platform

May 7, 2015 • Blog Post • By HPE Matter Staff Writer

This disruption to traditional virtualization is no longer a trend...it's practically a necessity

Containers are making a lot of waves lately, and for good reason: the standardized, simple container file format offers not just the promise of faster, more scalable data centers, but significantly more cost-effective ones too, with lower overhead requirements. Containers disrupt traditional virtualization as we know it today by leveraging software-defined infrastructure architecture to create a workload-centric model. It is no longer a trend; it's practically a necessity.

So, just how big of a game-changer does container technology represent?

"The economics behind running containers compared to virtual machines is so dramatic that it will actually drive itself," said Mark Interrante, senior vice president, Engineering, HPE Helion.

The shift away from operating system-level virtualization and toward application virtualization is one of the most disruptive developments to emerge out of Silicon Valley in recent memory, and the company that's arguably at the forefront of the container movement is Docker.

To be clear, Docker is far from the lone player in this space. Some of the central figures in the container market include CoreOS, Canonical, Heroku, as well as Google, Microsoft, and VMware. The container process model itself isn't a new concept: HP-UX explored similar workload-isolation ideas years ago, and Linux has offered container functionality for more than a decade.

But where Docker and other new entrants in the field have been able to differentiate themselves is by offering extremely streamlined solutions that make containers a lot more accessible and appealing to operators.

"Dockers biggest invention is the ability to move application workloads around in a very ubiquitous and uniform way, and combining that with two things that are key: a discovery and distribution mechanism in the form of registries, and a very efficient hosting model through the Docker daemon", said Interrante. "Docker has very effectively created a programming-level abstraction through the Docker daemon, which is why its interesting from a software perspective."

For HPE, container technology is significant because it has the potential to address several pressing needs in the marketplace. Firstly, as organizations adopt multiple clouds, it becomes critical to have a mechanism in place that facilitates application virtualization and increases cloud portability. Container technologies like Docker eliminate the need for expensive transformation and allow organizations to simply move applications between open source-based cloud environments.
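
As a rough illustration of that portability, the sketch below pulls the same image from a shared registry and runs it unchanged against two different Docker hosts, one per cloud. The endpoints, image name, and use of the Docker SDK for Python are assumptions for the example, not details from HPE.

```python
# Sketch: the same image, pulled from a shared registry and run unchanged on
# two Docker hosts (e.g. one per cloud). Endpoints and image are hypothetical.
import docker

REPO, TAG = "registry.example.com/myteam/webapp", "1.4"   # hypothetical image

# Two Docker daemons reachable over TCP; hostnames are placeholders.
clouds = {
    "cloud-a": docker.DockerClient(base_url="tcp://cloud-a.example.com:2376"),
    "cloud-b": docker.DockerClient(base_url="tcp://cloud-b.example.com:2376"),
}

for name, client in clouds.items():
    client.images.pull(REPO, tag=TAG)                     # fetch from the registry
    client.containers.run(f"{REPO}:{TAG}", detach=True)   # identical workload, no rework
    print(f"started {REPO}:{TAG} on {name}")
```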

Secondly, containers are an excellent vehicle for achieving better resource utilization rates than typical hardware virtualization. They provide the computing resources to run an application as if it were the only application running in the operating system. By leveraging containers, organizations can take advantage of numerous efficiency benefits, including the ability to deploy multiple containerized applications on a single cloud instance.
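
A hedged sketch of what that packing might look like in practice: several containerized services started on one host, each with its own memory and CPU caps. The service images, the limits, and the Docker SDK for Python are illustrative assumptions, not an HPE recipe.

```python
# Sketch: packing several containerized apps onto one cloud instance, each with
# its own resource caps. Images and limits are illustrative assumptions.
import docker

client = docker.from_env()

# Hypothetical services sharing a single host.
services = [
    {"image": "redis:latest",     "mem_limit": "256m", "cpu_shares": 256},
    {"image": "nginx:latest",     "mem_limit": "128m", "cpu_shares": 512},
    {"image": "memcached:latest", "mem_limit": "128m", "cpu_shares": 256},
]

for svc in services:
    client.containers.run(
        svc["image"],
        detach=True,
        mem_limit=svc["mem_limit"],    # cap this container's memory
        cpu_shares=svc["cpu_shares"],  # relative CPU weight vs. other containers
    )
```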

Thirdly, containers are a great tool for facilitating DevOps methodologies. One of the biggest challenges for many shops is moving applications from development to test and production environments. Containers maintain app configuration settings and dependencies, making them invaluable for agile development and for supporting continuous, seamless app delivery. This means that applications don't break in new environments, and operations teams don't have to spend time on manual installation or troubleshooting issues.
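
One way to picture that flow, as a sketch rather than HPE's actual pipeline: the image built in development, with its configuration and dependencies baked in, is tagged and pushed for each downstream environment, so test and production run exactly the same artifact. The registry URL, naming scheme, and Docker SDK for Python are hypothetical.

```python
# Sketch: promote one image, with its config and dependencies baked in, through
# dev, test, and prod. Registry, names, and tags are hypothetical.
import docker

client = docker.from_env()
REGISTRY = "registry.example.com"        # hypothetical private registry
NAME, VERSION = "myapp", "1.0.0"         # hypothetical application

# The image was built earlier from a Dockerfile that pins the app's dependencies.
image = client.images.get(f"{NAME}:{VERSION}")

# Tag the same artifact for each environment and push it to the shared registry;
# test and production pull exactly what development built.
for env in ("dev", "test", "prod"):
    repo = f"{REGISTRY}/{env}/{NAME}"
    image.tag(repo, tag=VERSION)
    client.images.push(repo, tag=VERSION)
```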

Currently, HPE is working with Docker and CoreOS and evaluating other leading container providers, and it intends to lend its security expertise to build containers that are as secure as they are agile.

HPE is in a position to lead customers with a vision of IT-as-a-Service: IT delivered as open source code optimized for HPE hardware, based on service and container chaining and aggregation. A pan-HPE team is working to co-design containerized hardware and software solutions, which will allow HPE to create a competitive advantage in automation, monitoring and operations, security (isolation), performance, manageability, and high availability.

For HPE, it is not just about how it will create containerized solutions, or even consume them internally, as HPE IT already is. It's also about contributing back to the open source community and helping make containers ubiquitous and able to scale into the enterprise.
