HPE CEO Meg Whitman on the Value of a Moonshot

Blog Post • By Meg Whitman


  • Today, HPE announced that it has created the largest single-memory computing system the world has ever seen, capable of holding 160 terabytes of data.
  • The system is the latest development from The Machine research project, HPE's quest to invent the world's first Memory-Driven Computing architecture - a completely new way of storing and processing data.

Why HPE is making a big bet on The Machine research project and Memory-Driven Computing

In May of 1961, 56 years ago this month, President John F. Kennedy challenged the United States to put a man on the Moon by the end of the decade. His call to action rallied the nation's brightest minds and unleashed fearless innovation in its pursuit. Less than 10 years later, two men walked on the surface of the Moon.

To this day, I am inspired by how the realization of President Kennedy's vision captivated the world. An estimated 530 million people watched the Moon landing in awe, and it has had an immeasurable impact on us ever since.

The moonshot, as it came to be known, gave a major boost to the U.S. economy, employing more than 400,000 people at its peak. And the technological advancements that sprang from the endeavor - from the microchip to memory foam - forever changed our lives here on Earth.

The value of a moonshot for business

The term moonshot has since become synonymous with any big, ambitious goal that seems out of this world but holds great promise. And at the right time, with the right motivation, a moonshot mission can be a powerful business tool, invigorating an organization, boosting employee morale and unlocking innovation simply not possible within the confines of everyday operations.

At HPE, we have just such a moonshot in The Machine research project, our quest to invent the world's first Memory-Driven Computing architecture - a completely new way of storing and processing data.

We believe this new architecture, conceived specifically for the big data era, can deliver critical leaps in computing performance - speed, reliability, energy efficiency - that will allow us to harness all of humanity's data in ways that simply aren't possible today.

Why do we need a new kind of computer?

While technology has improved by leaps and bounds since the first mission to the Moon, the fact is - from the smartphone in your pocket to the world's most powerful supercomputers - the fundamental architecture on which those computers are built hasn't really changed in more than 60 years.

As we quickly move toward an era where everything is connected and everything computes - smart cars, smart homes, smart factories, smart bodies - the amount of data we create will be staggering, and our expectations for what we will be able to do with that data will be equally profound.

By the year 2020, researchers project our digital universe will contain nearly as many bits of data as there are stars in the universe. But what will we do with it all? Despite the enormous amounts of data we're already creating today - more in the past two years than in the entire history of the human race - less than 1 percent of that data is ever analyzed.

The fact is, the incremental increases we are seeing in our computing power will not meet the exponential demands of our future challenges.

We need a new kind of computer - one designed specifically to meet the grand challenges of our day - and we are now one step closer to achieving that mission.

  • HPE has created the largest single-memory computing system the world has ever seen.

The Machine research project & Memory-Driven Computing

In 2014, HPE introduced The Machine research project. In 2016, we delivered the first prototype. And in just six months, we have scaled the prototype 20-fold.

Today, I'm thrilled to tell you that HPE has created a computer with the largest single-memory system the world has ever seen, capable of holding 160 terabytes of data.

That's enough memory to simultaneously work with the data held in approximately 160 million books - five times the number of books in the Library of Congress. And it's powerful enough to reduce the time needed to process complex problems from days to hours.

No computer on Earth can manipulate that much data in a single place at once. And this is just our prototype. Our scientists and researchers believe we can build a Memory-Driven Computer system with up to 4,096 yottabytes of data, more than 250,000 times the size of our digital universe today.

So why is this important? When you can analyze that much data at once, you can begin to discover correlations never before possible. And that ability will open up entirely new frontiers of intellectual discovery.

  • The ability to analyze that much data will open up new frontiers of intellectual discovery.

Preparing for a world where everything computes

Just as the Moon mission brought us satellite TV and smoke detectors, we expect The Machine research project to produce groundbreaking technologies like photonics and non-volatile memory, all of which will be used to improve the ways we work and live.

That is our mission. To enable a world where everything computes. To bring real-time intelligence to every edge of the Earth and beyond. And to help the world harness that intelligence to answer some of our biggest questions, solve some of our toughest challenges and help us better understand the world around us.

This is our next frontier, and I'm excited to explore it together.