Redefining IT Infrastructure to Allow Real-Time Decisions
NOVEMBER 16, 2015 • Blog Post • Sue Poremba, HPE Matter Contributor
IN THIS ARTICLE
- Memory has become more abundant and less expensive, leading to an opportunity to revolutionize IT infrastructure
- Kirk Bresniker, chief architect of Hewlett Packard Labs Systems Research, discusses how The Machine is driving this shift from computing to memory
How The Machine from Hewlett Packard Enterprise will revolutionize computing
By 2020, our world is expected to have approximately eight billion people and 28 billion connected devices. That adds up to a lot of data interacting with hundreds of billions of pieces of infrastructure. This wealth of information is expected to change just about everything in our everyday lives. Big Data has the power to reshape everything from the way enterprises interact with customers, to the way consumers travel, to the way politicians run their campaigns. As analytics tools improve, every person and every organization will be able to gather data in real time and make decisions based on what is happening at any given moment. The challenge lies in today’s traditional IT infrastructure: the servers, networking and storage that hold all of this information. What’s needed is a transformation in the base hardware model of the infrastructure.
Shifting from computing to memory
Despite the incredible computing advances of the past decade, the fundamentals haven’t changed since the 1950s. Some pieces of the system are far faster (operations that used to be measured in seconds are now measured in nanoseconds), yet the overall characteristics of computers in 2015 are remarkably similar to those of machines from 60 years ago. That stability has afforded the luxury of maintaining software, operating systems and applications in the same way for years, according to Kirk Bresniker, chief architect of Hewlett Packard Labs Systems Research. New technologies like Hewlett Packard Labs’ The Machine are redefining the way we compute to make our technology infrastructure more adaptable to the changing IT landscape. The shift will come from a switchover in the basic economics of computation and memory: rather than focusing primarily on computation to solve problems quickly, we can rely on the stored memory of past solutions to better predict outcomes.
“It has been easier to compute an answer than to remember it,” said Bresniker. “That has to do with the access time of memory and the cost of memory versus the cost of computation.” Thanks to semiconductors, computation advanced more rapidly and has long been the more abundant resource. Today, however, we’re seeing a shift in memory technology, Bresniker explained. As memory becomes more abundant and less expensive, the balance in problem solving is tipping toward remembering answers rather than recomputing them, and that tips the infrastructure toward transformation. Big Data analytics has all the firepower to be the first practical application of this new infrastructure. Even though organizations tend to hoard data in the hope it will be useful someday, information has a finite shelf life: it is valuable only for a limited time, often just until it informs a decision. Flight data, for instance, is useful only until the plane takes off; after that, it is worthless.
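As a loose sketch of this trade-off (a hypothetical illustration, not anything drawn from The Machine’s design), consider a cache that remembers past answers instead of recomputing them, but only while they are still fresh, echoing the shelf-life point above:

```python
import time
from functools import wraps

def memoize_with_ttl(ttl_seconds):
    """Remember computed answers, but only for a limited time.

    Hypothetical example: like flight data, a stored answer is
    useful only until it expires."""
    def decorator(func):
        cache = {}  # maps arguments -> (result, timestamp)

        @wraps(func)
        def wrapper(*args):
            now = time.monotonic()
            if args in cache:
                result, stored_at = cache[args]
                if now - stored_at < ttl_seconds:
                    return result  # cheap: remember the answer
            result = func(*args)  # expensive: compute the answer
            cache[args] = (result, now)
            return result
        return wrapper
    return decorator

@memoize_with_ttl(ttl_seconds=60)
def best_route(origin, destination):
    # stand-in for an expensive analytics query
    time.sleep(1)
    return f"{origin} -> {destination} via hub"

best_route("SFO", "JFK")  # slow: computed from scratch
best_route("SFO", "JFK")  # fast: remembered, while still fresh
```

When memory is scarce and costly, the cache stays small and most answers must be recomputed; when memory is abundant and cheap, as Bresniker describes, the balance flips and remembering becomes the default.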
Applications for the future
One area where we can expect Big Data to be most useful is the ever-expanding Internet of Things. Technology like The Machine allows for faster, more fluid transmission of data, and with the infrastructure built into the Internet of Things, the uses for real-time Big Data analytics are endless. Big Data can be used in real time to create new public-sector capabilities, such as determining when to switch the power grid from sustainable to non-sustainable resources, or to recognize the best time to bring a product to market.
This is just the beginning of the new vision of IT infrastructure. We’re already seeing changes in the way we think about sharing information and solving problems. Other exciting changes include the ways technology like The Machine can approach security (with no need to copy data from one system to another, the risk of data compromise decreases) and a new vision for the data center. The Machine and technologies like it will let data, infrastructure and intelligence work in tandem. We want to be able to react in real time to ever-changing demands and environmental conditions in order to make the most of data and technology, Bresniker stated. “Things need to be done in real time,” he added, “and that’s how infrastructure needs to change.”
The infrastructure needed to collect, process, store and analyze the future’s data demands transformational changes in the foundations of computing. Learn how The Machine will redefine the future of computing for your business.