To Handle the Big Data Deluge, HPE Plots a Giant Leap Forward With “The Machine”

July 1, 2014 • Blog Post

IT planners looking out a decade recognize that the current mix of technologies will have trouble keeping pace with the exponential growth of Big Data

In 1965, Gordon E. Moore published the now-famous prediction that computer processors would double in power every two years. And despite widespread doubts about how long exponential growth could continue, Moore's Law still holds true, roughly half a century later.

Now, though, industry leaders are beginning to fret about a dilemma of a much larger order. The problem is hybrid in nature, encompassing the combined challenges of improving processor speed, storage capacity and networking throughput simultaneously to meet the towering growth demands of data coursing across global networks.

Few understand just how big Big Data has gotten. Many of us probably still think a terabyte is a lot of data. Today, our digital universe is about four zettabytes. To put that in perspective, while one terabyte can store roughly 100,000 minutes of music, one zettabyte can store just over two billion years of music. By the end of the decade, we'll be starting to use a unit that few people have ever heard of: the brontobyte (a billion exabytes), or two quadrillion years of music.

At an enterprise level, IT planners looking out a decade or so recognize that the current mix of technologies will have trouble keeping pace with the exponential growth of Big Data.

For Hewlett Packard Enterprise, the solution to this challenge has a deceptively simple name: The Machine. The name belies the ambitious scope of a technology development path with few antecedents in company or industry history. The Machine, a program announced in June 2014, aims to solve this rising problem by coordinating and advancing four emergent technologies in parallel, so that the rising flood of data does not overwhelm conventional legacy technologies.
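For readers who want the unit arithmetic spelled out, here is a minimal sketch of how these prefixes relate, assuming the usual decimal definitions (the brontobyte is an informal unit, commonly taken as 10^27 bytes):

    # Decimal byte units mentioned above; "brontobyte" is informal but
    # commonly taken to mean 10**27 bytes.
    TERABYTE   = 10**12
    EXABYTE    = 10**18
    ZETTABYTE  = 10**21
    BRONTOBYTE = 10**27

    print(ZETTABYTE // TERABYTE)    # prints 1000000000: a zettabyte is a billion terabytes
    print(BRONTOBYTE // EXABYTE)    # prints 1000000000: a brontobyte is a billion exabytes
    print(BRONTOBYTE // ZETTABYTE)  # prints 1000000: or a million zettabytes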

HPE's Recipe:

 

  • System on a chip. It starts with replacing general-purpose processors with special-purpose cores integrated with memory and networking into a single chip package. Building on the foundations laid by our revolutionary Moonshot microservers, this promises to slash the energy that conventional microprocessors require and chomp through huge amounts of data much more rapidly. It's like having a toolbox full of specialized tools rather than a Swiss Army Knife: the right tools are faster and more efficient.

 

 

  • Memristors. Today, all our devices, from phone to supercomputer, constantly shuttle information between three layers of memory: what's needed this instant (SRAM), what will be needed very soon (DRAM) and what may be needed later (storage). Memristors will be fast, dense and cheap enough to play both the "soon" and "later" roles at once, speeding up throughput by eliminating most of the to-and-fro. Critically, memristor memory is also nonvolatile, meaning that no electricity is needed to maintain the data. This massively reduces the energy required to store data and makes systems virtually immune to power cuts. (A rough sketch of what collapsing these tiers could mean for software appears after this list.)

 

 

  • Photonics. Today, we use fiber optics to move data over long distances. To boost the throughput of information flowing between processor cores within data centers, HPE is pushing ahead with optical links that relay bits via photons rather than electrons. This eliminates copper wires as the conduit, and with them a big source of energy and space inefficiency. High-speed photonic fabrics (a term for the web of connections between processor cores) will allow unprecedented storage and computational resources to be marshalled under a radically simplified programming model, moving data between hundreds of thousands of optimized computing cores and exabytes of memristor storage.

 

 

  • OS. Hardware alone won't solve this problem. The fourth piece of HPE's vision is a brand-new operating system that orchestrates the flow of data across this new hardware. Virtually all software in existence is written to cope with the limitations of conventional architectures. Freed from those constraints, coders will be able to create applications that manipulate and extract meaning from vastly larger data sets than is possible today.
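To make the memristor idea above a bit more concrete, here is a minimal sketch, in Python, contrasting today's pattern of copying data between memory and storage with the kind of in-place, byte-addressable persistence a memristor-based design aims to enable. It is purely illustrative: the file name is hypothetical, and mmap over an ordinary file merely stands in for true nonvolatile memory.

    import mmap
    import os
    import struct

    PATH = "counter.bin"  # hypothetical file standing in for a pool of nonvolatile memory

    def bump_via_storage():
        # Conventional pattern: read the value from storage into DRAM, update it,
        # then explicitly write it back so it survives a power cut.
        try:
            with open(PATH, "rb") as f:
                (value,) = struct.unpack("<Q", f.read(8))
        except FileNotFoundError:
            value = 0
        value += 1
        with open(PATH, "wb") as f:
            f.write(struct.pack("<Q", value))
        return value

    def bump_in_place():
        # Memory-centric pattern: the working copy and the persistent copy are the
        # same bytes, updated in place with ordinary loads and stores.
        if not os.path.exists(PATH):
            with open(PATH, "wb") as f:
                f.write(b"\x00" * 8)
        with open(PATH, "r+b") as f:
            with mmap.mmap(f.fileno(), 8) as mem:
                (value,) = struct.unpack("<Q", mem[:8])
                mem[:8] = struct.pack("<Q", value + 1)
        return value + 1

The contrast is the point: in the second version there is no separate save or load step, because the data being computed on and the data that persists are one and the same. That, in miniature, is what fast, cheap, nonvolatile memory promises at data-center scale.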

 

So what does this all mean?

If HPE can deliver on these technologies (its timeline runs through 2020), the benefits will be enormous, with quantum leaps in performance and energy efficiency. The problems of having to build, and find electricity for, thousands of new data centers will effectively disappear. Computational tasks that today require government levels of funding and legions of data scientists will be within the reach of almost anyone.

As with all innovations that come with great promise, The Machine will face its ultimate test once it's in the field. Once it's there, HPE hopes it will not only address critical business needs today but, more importantly, the ones we will face tomorrow.
