Beyond virtualization: How to boost data center efficiency and save money

Virtualization is the beginning, not the goal. To gain an advantage in energy efficiency, and therefore financial savings, you need to do more.

Virtualization has been one of the main drivers of data center efficiency. It allows enterprises to use fewer servers and dial back their energy use as a result, taking some of the sting out of reports that data centers are using an outsized percentage of the developed world's energy.

But for some companies, the benefits of virtualization may have hit a peak. Most already know it's something you do if you want efficiency, but many are doing only the basics, and many have reached the maximum threshold for off-the-rack efficiency, according to Laura Cunningham, a data center consultant for HPE Technology Consulting. Either way, there are additional moves companies can make to boost the efficiency of their data operations and save money.

Thankfully, a growing number of tools are available to help CIOs curb energy use and manage their data more efficiently. Building on a foundation of virtualization, a smart data center operator can increase cost savings by optimizing virtual instances through techniques such as improved utilization and deduplication, or through the strategic use of cloud and emerging technologies such as edge and fog computing.

Optimize, utilize, deduplicate

One of the factors contributing to a ceiling in virtualization benefits is that companies will often virtualize a new server but won't do the same for older equipment. "Go back and eliminate the physical infrastructure that existed in the past," says HPE master business consultant Bob Graham. "In most situations, this has not been done."

There are a number of reasons for avoiding that extra—and ongoing—step. The company or division may not have funding for it, or an important application might not be supported. Regardless of the nature of the obstacle, "the key thing is to go back to the well and virtualize servers and storage" that were left behind, Graham says.

A related task that can bring your virtualization up to its limits is "zombie hunting": shutting down equipment that draws power while doing nothing. "Any organization is going to have some zombie servers, and decommissioning them saves a 300-watt load for every server," says Roger Tipley, president of Green Grid, a consortium focused on advancing data center efficiency. Once a server reaches about 67 percent utilization, it's about as highly utilized as it can get. At that point, to squeeze more efficiency out of it, you have to optimize your operations.
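That 300-watt figure translates directly into dollars. Here is a rough back-of-the-envelope sketch; the electricity rate and PUE below are illustrative assumptions, not figures from Green Grid:

```python
# Back-of-the-envelope savings from decommissioning zombie servers.
# The 300 W per-server load comes from the article; the electricity
# rate and PUE below are illustrative assumptions.

ZOMBIE_LOAD_W = 300              # idle draw per zombie server (from the article)
ELECTRICITY_USD_PER_KWH = 0.10   # assumed utility rate
PUE = 1.6                        # assumed power usage effectiveness (cooling overhead)
HOURS_PER_YEAR = 24 * 365

def annual_savings(num_zombies: int) -> float:
    """Estimated annual cost of powering (and cooling) zombie servers."""
    kwh = num_zombies * ZOMBIE_LOAD_W / 1000 * HOURS_PER_YEAR * PUE
    return kwh * ELECTRICITY_USD_PER_KWH

if __name__ == "__main__":
    for n in (1, 10, 50):
        print(f"{n:>3} zombie servers: ~${annual_savings(n):,.0f} per year")
```

Under those assumptions, a single zombie costs roughly $400 a year; the point is that the savings scale linearly with every server you unplug.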

Fine-tune utilization

Once you've done everything you can to virtualize, you can begin to fine-tune your utilization. Many companies are unaware of how much storage and compute they are actually using, Cunningham says. They typically also don't know where that storage and compute sits: sometimes not even which data center a given piece of data lives in, much less which server.
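Getting that visibility starts with measurement. Below is a minimal sketch of a per-host utilization snapshot using the open source psutil library; how the results are collected and aggregated across hosts is left open, and the output format is illustrative:

```python
# Minimal per-host utilization snapshot; run on each host and collect
# the results centrally. Requires the open source psutil package.
import json
import socket

import psutil

def utilization_snapshot() -> dict:
    """Report how busy this host actually is right now."""
    return {
        "host": socket.gethostname(),
        "cpu_percent": psutil.cpu_percent(interval=1),   # sampled over 1 second
        "memory_percent": psutil.virtual_memory().percent,
        "disk_percent": psutil.disk_usage("/").percent,
    }

if __name__ == "__main__":
    print(json.dumps(utilization_snapshot(), indent=2))
```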

Tipley recounts a meeting with a data center architect from a large tech company who was very worried about growing out of server space. "But after deduplication and other utilization improvements, it turned out to be the opposite," Tipley says. From six enterprise data centers, this company went down to just four, reducing energy costs by 33 percent.

Graham worked with one company that turned out to have 12 full copies of its master files. One group would duplicate them for development, another for recovery, another for staging, another for training, and so on. Deduplication cut the space and energy those redundant copies required, resulting in substantial savings.
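The mechanics behind that saving are straightforward: deduplication stores each unique block of data once and replaces repeats with references. A simplified sketch of the idea, using fixed-size blocks and SHA-256 fingerprints (production systems use far more sophisticated chunking):

```python
# Toy block-level deduplication: store each unique block once,
# keyed by its SHA-256 fingerprint, and keep only references for repeats.
import hashlib

BLOCK_SIZE = 4096  # fixed-size blocks for simplicity

def dedup(files: dict[str, bytes]) -> tuple[dict[str, bytes], dict[str, list[str]]]:
    store: dict[str, bytes] = {}          # fingerprint -> unique block
    manifests: dict[str, list[str]] = {}  # file name -> ordered fingerprints
    for name, data in files.items():
        refs = []
        for i in range(0, len(data), BLOCK_SIZE):
            block = data[i:i + BLOCK_SIZE]
            digest = hashlib.sha256(block).hexdigest()
            store.setdefault(digest, block)  # stored once, no matter how many copies
            refs.append(digest)
        manifests[name] = refs
    return store, manifests

if __name__ == "__main__":
    master = b"customer records " * 10_000
    # Twelve teams each keep a full copy of the master files...
    copies = {f"copy_{i}": master for i in range(12)}
    store, _ = dedup(copies)
    raw = sum(len(d) for d in copies.values())
    kept = sum(len(b) for b in store.values())
    print(f"raw: {raw:,} bytes, stored after dedup: {kept:,} bytes")
```

The dozen logical copies still exist as manifests, but the blocks behind them are stored only once.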

Embrace the cloud  

Cloud technology has emerged as a key tool to maximize data center efficiency. A highly virtualized company is more likely to use cloud technology and cloud services, according to Cunningham. In some cases, the company is outgrowing its data center and turns to the cloud for added capacity. In other cases, the company has unpredictable capacity needs, and the data it adds to the cloud is not mission-critical.

A company that does not already have a data center and needs to get something up and running quickly may benefit from employing cloud technology, says Graham. You can spin up storage and compute in short order, though you have to be careful about what types of data you move to the cloud. He notes that mission-critical, steady-state workloads are the least appropriate for a cloud environment.

Lately, cloud providers have created more robust security environments that offset, to some extent, the vulnerability of cloud technology to hacking. This helps make the case for cloud deployment from both an energy and a security perspective.

Get an edge

Edge computing is a new element in the calculus of data center management. Organizations currently struggle with an average roundtrip data penalty of 53 microseconds per 1,000 miles, according to Tipley. High-speed trading and autonomous vehicles are two examples of technologies that can't tolerate this much latency.

When compute and memory sit in the same place, or at least very close together, and the nodes where they interact are spread out, there is far less need to send data to a central location for processing and then push the results back out. Instead, the data is processed in situ, making computing faster while reducing energy expenditure and cooling needs.
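In practice, the pattern looks less like "ship everything to headquarters" and more like "decide locally, report occasionally." Here is a hypothetical sketch of an edge node that acts on sensor readings in situ and forwards only a periodic summary; the sensor, threshold, and upstream call are all placeholders:

```python
# Hypothetical edge node: act on readings locally, ship only summaries upstream.
import random
import statistics
import time

TEMP_LIMIT_C = 80.0   # illustrative threshold for local action
BATCH_SIZE = 60       # readings per summary sent upstream

def read_sensor() -> float:
    """Placeholder for a real sensor read."""
    return random.uniform(60.0, 90.0)

def actuate_cooling() -> None:
    """Placeholder for a local control action taken without a central round trip."""
    print("cooling engaged locally")

def send_summary(summary: dict) -> None:
    """Placeholder for the occasional upstream report to the data center."""
    print("summary sent upstream:", summary)

def run_once() -> None:
    readings = []
    for _ in range(BATCH_SIZE):
        value = read_sensor()
        if value > TEMP_LIMIT_C:     # decision made in situ, no central round trip
            actuate_cooling()
        readings.append(value)
        time.sleep(0.01)             # stand-in for the sampling interval
    send_summary({"count": len(readings),
                  "mean_c": round(statistics.mean(readings), 1),
                  "max_c": round(max(readings), 1)})

if __name__ == "__main__":
    run_once()
```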

Forecast: fog

Fog computing and edge computing are terms commonly used in reference to the Internet of Things (IoT). Fog computing is a decentralized computing infrastructure that distributes compute and storage as close to the end user as possible, instead of centralizing it in data centers. The processing happens in a data hub on a smart device or in a smart router or gateway, thus reducing the amount of data sent to the cloud.

Edge computing takes localized processing a step further by bringing computing even closer to the source of data creation. Each device on the network has a role to play in processing the data it generates.
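A hypothetical sketch of that division of labor: devices do first-pass filtering, a gateway condenses what remains, and only the condensed record travels to the cloud. The device count, change threshold, and byte accounting are all illustrative:

```python
# Hypothetical fog pattern: devices filter, a gateway aggregates,
# and only the aggregate leaves for the cloud.
import json
import random

CHANGE_THRESHOLD = 0.5   # illustrative: a device reports only meaningful changes

def device_stream(device_id: int, samples: int = 1000) -> list[dict]:
    """Edge role: each device filters its own raw readings before reporting."""
    reports, last = [], None
    for _ in range(samples):
        value = round(random.uniform(20.0, 25.0), 2)
        if last is None or abs(value - last) >= CHANGE_THRESHOLD:
            reports.append({"device": device_id, "value": value})
            last = value
    return reports

def gateway_aggregate(all_reports: list[dict]) -> dict:
    """Fog role: the gateway condenses device reports into one cloud-bound record."""
    values = [r["value"] for r in all_reports]
    return {"devices": len({r["device"] for r in all_reports}),
            "reports": len(values),
            "avg": round(sum(values) / len(values), 2)}

if __name__ == "__main__":
    reports = [r for d in range(10) for r in device_stream(d)]
    summary = gateway_aggregate(reports)
    raw_bytes = 10 * 1000 * len(json.dumps({"device": 0, "value": 22.22}))
    sent_bytes = len(json.dumps(summary))
    print(f"raw readings: ~{raw_bytes:,} bytes; sent to cloud: {sent_bytes:,} bytes")
    print(summary)
```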

Fog computing is already prevalent in smart buildings, smart grids, and software-defined networks. In 2015, Cisco Systems, Intel, Microsoft, Princeton University, Dell, and ARM Holdings founded the OpenFog Consortium, which now has almost 50 corporate and institutional members.

As use of the IoT becomes more widespread, these two approaches offer similar ways to process data more efficiently, relieving pressure on your data center and increasing cost savings.

Data center efficiency: Lessons for leaders

  • Virtualization is table stakes. Your competition is also virtualizing. 
  • A sound data center strategy should include tactics such as improving utilization, deduplication, and zombie hunting.
  • Edge and fog computing are emerging technologies that can boost efficiency, reduce latency, and save energy. 

This article/content was written by the individual writer identified and does not necessarily reflect the view of Hewlett Packard Enterprise Company.