5 Trends This Year Defining High Performance Computing

MARCH 16, 2016 • Blog Post • By Bill Mannel, HPE's VP


  • High performance computing (HPC) is on track to completely change the way enterprises around the globe operate

HPE’s Bill Mannel shares insights for the future

Once the domain of obscure academics and gilded research institutions, high performance computing (HPC) is making its way into the enterprise.

I'm seeing it all over the world, and I'm not the only one. According to IDC, large-scale server sales are on track to grow 8.2 percent annually from now through 2019. Data storage for supporting HPC environments is expected to grow by 9.9 percent annually over the same period. Deep-pocketed organizations are increasingly committing the resources needed to glean intelligence that's been hidden in their data. As a result, HPC is about to become one of the essential ingredients for conducting business in the data-driven era in which we live and work. Here are five predictions for how it will change enterprises around the globe:
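Those projected rates compound quickly. As a rough illustration (the 8.2 and 9.9 percent figures are IDC's; the four-year window and the arithmetic below are my own framing):

```python
# Back-of-the-envelope compounding of IDC's projected annual growth
# rates (8.2% for HPC servers, 9.9% for HPC storage) over a
# four-year 2016-2019 window. Illustrative arithmetic only.
def total_growth(annual_rate: float, years: int) -> float:
    """Overall growth factor after compounding annual_rate for `years` years."""
    return (1.0 + annual_rate) ** years

print(f"Servers: ~{(total_growth(0.082, 4) - 1) * 100:.0f}% cumulative growth")  # ~37%
print(f"Storage: ~{(total_growth(0.099, 4) - 1) * 100:.0f}% cumulative growth")  # ~46%
```

In other words, even single-digit annual growth implies the HPC storage footprint expanding by nearly half over the forecast period.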

1. National commitments to exascale computing will filter down into commercial systems. Driven by the National Strategic Computing Initiative (NSCI), the U.S. is locked in a race to be first with a national exascale computing capability. Think of it like the Space Race of the 1960s, but with investment going into software and large-scale systems rather than rockets. The stakes are different, but the potential for world-changing results is similar. At the very least, processing a billion billion (10^18) calculations per second, 1,000 times more than was possible just seven years ago, changes how we pursue scientific discovery. Rather than testing hypotheses at a macro level, we can work with micro-level precision that leads to better products. Think of a wind turbine. Without HPC, researchers could only test for very general system-wide improvements. With it, they can analyze the impact of friction at multiple points on a single turbine blade, making improvements a millimeter at a time. Efficiency gains can multiply quickly at that level, leading to a whole new paradigm for investing in clean wind energy.

2. New applications will create more opportunities to use HPC systems in the enterprise. A few years ago, Big Data was an esoteric one-off. No longer. HPC systems are now wading through seas of corporate data in pursuit of profit-making insights. But correlation offers the biggest potential payoff. Think of a life insurance company that already stores petabytes' worth of data on consumer trends. Rather than letting all that data sit in storage, HPC can be used to derive actionable intelligence from the collected bits and bytes. For instance, mixing in social signals from Facebook, Twitter and elsewhere can help sharpen behavioral insights and improve models. As a result, premiums could become more accurate, and more profitable.

3. HPC will revolutionize how major enterprises do business, starting with financial services. HPC isn't just about finding and crunching the right data. Some enterprises are using it to change the way they do business. Look at the major banks and investment firms: a majority now use HPC to crunch reams of data in microseconds to better compete in the bare-knuckles world of high-frequency trading (HFT), where orders are routinely executed in hundredths of a second. Manufacturers, too, are taking to the technology. Automakers, aircraft makers and other designers of industrial equipment are using HPC systems to conceptualize large-scale products before building them, saving huge sums by catching prototyping mistakes while they're still visible on screen. How and when large companies go to prototype is bound to change in the process.

4. Tiered storage models will support HPC adoption in the enterprise. HPC systems make use of data at a scale never before seen. The bad news? Feeding these systems means storing data efficiently, and tiering offers the best approach. Specifically, enterprises will hold the most-accessed data in "fast-scratch" systems that act as a high performance cache. Underneath that will be a reliable layer of storage that's composed of disks and all-flash arrays. These will be tiered so that the fastest media holds the data that needs to be accessed most often. And finally, backups of all crucial information will be archived in "cold storage" media such as tape. We'll also see these arrangements become software-defined, making it easy to add and alter storage infrastructures as HPC needs change.
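To make the tiering idea concrete, here is a minimal sketch of the placement policy described above. The tier names, thresholds, and dataset fields are all hypothetical; a real software-defined tiering engine would use far richer heat metrics and automated migration between tiers:

```python
# Hypothetical sketch of access-frequency-based tier placement.
# Thresholds and tier names are illustrative, not from any real product.
from dataclasses import dataclass

@dataclass
class DataSet:
    name: str
    accesses_per_day: int
    is_backup: bool = False

def assign_tier(ds: DataSet) -> str:
    """Route a dataset to a storage tier by how hot it is."""
    if ds.is_backup:
        return "cold-archive (tape)"   # archival copies of crucial data
    if ds.accesses_per_day > 1000:
        return "fast-scratch (cache)"  # hottest working set
    if ds.accesses_per_day > 10:
        return "all-flash array"       # warm, latency-sensitive data
    return "disk array"                # bulk, rarely-read data

print(assign_tier(DataSet("simulation-checkpoints", 5000)))  # fast-scratch (cache)
```

The point of making such a policy software-defined is that these thresholds become configuration rather than hardware decisions, so the storage layout can shift as HPC workloads change.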

5. HPC will be delivered flexibly, as a service, using a combination of consumption models. While we tend to think of high performance computing in terms of a single system or cluster, enterprises' growing data needs require something more flexible. In response, we'll see vendors lean toward open architectures that allow for additional compute and storage to be accessed and consumed at will. We'll also see internal departments leveraging software-defined data center assets for controlling on-premises HPC clusters and compiling an associated software stack—an entire solution, if you will—through a single console, on demand, as if it were housed in the public cloud. In this way, hybrid infrastructure will come to include hybrid HPC.

When will we see these shifts occur? Many are happening already. Hybrid infrastructure environments are especially common in modern enterprises. Adding HPC systems to the mix makes sense when you consider how much data they're already handling — and how much more they could be if given the chance and the tools.

Companies that adapt quickly and form HPC-driven enterprises will enjoy the greatest rewards from having a processing advantage. Those that don't will suffer as laggards tend to, so choose wisely.

Bill Mannel is vice president and general manager of High Performance Computing, Big Data and Internet of Things at Hewlett Packard Enterprise. HPCwire recently named him to its 2016 "People to Watch" list.

