7 smart new ways to cut energy bills in your data center
Thanks to efficiency improvements, electricity demand in U.S. data centers has been fairly stable since 2010. However, the amount of energy needed to power data centers remains significant.
According to the Natural Resources Defense Council (NRDC), U.S. data centers consumed about 91 billion kilowatt-hours (kWh) of electricity in 2013—roughly the equivalent of the output of 34 large coal-fired power plants. The NRDC also projects that data center energy consumption will rise to 140 billion kWh annually by 2020, costing companies a total of $13 billion in annual electricity bills.
The bulk of this energy is consumed by corporate data centers, not by those run by Internet Goliaths such as Amazon. Faced with these numbers, organizations are understandably looking for ways to improve energy efficiency. Many strategies are already successfully in use, while others are emerging technologies that will become commercially available in the near future.
To date, efficiencies have been achieved with technologies such as liquid cooling and hot-aisle containment, which lower the amount of energy needed to keep the data center cool. Virtualization, which consolidates workloads onto fewer physical machines, is also becoming a major driver of cost savings, as is the use of more powerful and efficient servers and limiting the amount of electricity available to idle equipment.
Many enterprises can't invest in the huge resources needed to harness the efficiencies of hyperscale data centers, custom power plants, or bespoke data centers. Fortunately, other innovations that address everything from power supplies to air flow can help companies cut energy costs and improve their environmental record at the same time.
Here are seven innovative ways to improve energy efficiency in your data center.
1. Artificial intelligence
In 2014, Google spent $600 million to acquire DeepMind, a British artificial intelligence (AI) company. Google had already been using machine learning in its data centers to save energy and reduce costs, but the addition of DeepMind's technology produced dramatically better outcomes. The company cut the amount of energy it used for cooling by up to 40 percent and improved its power usage effectiveness (PUE)—defined as the ratio of total building energy usage to IT energy usage—by 15 percent.
In a blog post detailing how it achieved these efficiencies, Google says it used a group of deep neural networks to analyze data collected from thousands of sensors within its data centers, including temperature, power levels, and pump speeds. It then used that data to create "a more efficient and adaptive framework to understand data center dynamics and optimize efficiency." The algorithm, trained on Google's operational data, was able to model plant performance.
The result: The lowest PUE the test site had ever seen.
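PUE, as defined above, is simple arithmetic: total facility energy divided by the energy delivered to IT equipment. A minimal sketch with hypothetical numbers (these are not Google's actual figures):

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total building energy / IT energy.
    1.0 is the theoretical ideal (every watt goes to computing)."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# Hypothetical facility: 1,200 MWh total, 1,000 MWh of that to IT gear.
before = pue(1_200, 1_000)   # 1.20
# A 15 percent improvement, as in Google's reported result:
after = before * 0.85        # 1.02
print(f"PUE before: {before:.2f}, after: {after:.2f}")
```

The closer the ratio gets to 1.0, the less energy is spent on overhead such as cooling and power distribution rather than on computing itself.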
"Most data center energy waste goes to idle computing equipment and disk storage," says Eugean Hacopians, CEO of ANRE Technologies, an IT services firm. "If you can use some intelligence to shift loads around and shut down or spin down equipment, then there should be a lot of energy savings. Not turning on equipment will not generate heat, therefore you won't need cooling and you'll save more energy. Analytics also can help this process."
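Hacopians' suggestion of shifting loads and spinning down idle equipment can be sketched as a simple greedy consolidation pass. Everything here is hypothetical (the server names, the 70 percent utilization cap, the packing strategy); a production scheduler would also have to respect SLAs, redundancy, and workload affinity:

```python
from dataclasses import dataclass

@dataclass
class Server:
    name: str
    utilization: float  # fraction of capacity in use, 0.0 to 1.0

def consolidation_plan(servers, target=0.70):
    """Greedily pack the fleet's total load onto as few servers as
    possible, capping each survivor at `target` utilization. The rest
    can be spun down, saving both power and the cooling their heat
    would otherwise require."""
    total_load = sum(s.utilization for s in servers)
    keep, remaining = [], total_load
    for s in sorted(servers, key=lambda s: s.utilization, reverse=True):
        if remaining <= 0:
            break
        keep.append(s.name)
        remaining -= target
    spin_down = [s.name for s in servers if s.name not in keep]
    return keep, spin_down

fleet = [Server("a", 0.20), Server("b", 0.15),
         Server("c", 0.10), Server("d", 0.05)]
keep, idle = consolidation_plan(fleet)
print("keep:", keep, "spin down:", idle)
```

In this toy fleet, the combined load (50 percent of one server's capacity) fits on a single machine, so three of the four can be powered down.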
2. Fuel cells
Companies are increasingly using fuel cells to power their data centers and cut their energy use. Rather than using combustion to generate power, a fuel cell converts chemical hydrogen and oxygen into water, producing energy along the way—a process that produces significantly fewer emissions than standard fuel-burning technologies. Fuel cells also allow you to generate power on site, minimizing reliance on aging and inefficient electrical grids. And as an added benefit, the heat that fuel cells produce can be used for facility heating and cooling, further cutting costs associated with running heaters and air conditioning units.
AT&T, Apple, and eBay are already using centralized, high-capacity fuel cell installations to power some of their data centers. Taking that a step further, Microsoft is experimenting with colocating small fuel cell units directly inside server racks, making them self-sufficient and reducing or eliminating the amount of energy lost when energy is distributed through a network.
Both Kevin Farnsworth, chief operating officer at Agile Data Sites, and Hacopians acknowledge the promise of fuel cells, but they caution that the technology is difficult to scale effectively because of the amount of hydrogen gas required in the power-producing process. Hacopians finds the idea of building small fuel cells for each rack much more viable, but admits companies will have to solve challenges like how to store and distribute the fuel required to power hundreds of racks.
3. Underwater data centers
Depending on your outlook, the idea of submerging a data center in sea water may seem brilliant or ridiculous. But it's exactly what Microsoft has done with its Project Natick.
In August 2015, the company deployed a data center in a 10-by-7-foot container vessel—dubbed the Leona Philpot, after a character in Microsoft's Halo video game—about a kilometer off the coast of California. The idea is fairly simple: Employing cold ocean water totally eliminates the costs of cooling the data center with air conditioning. And although Leona Philpot is powered by an existing land-based power grid, Microsoft envisions underwater data centers like this ultimately being powered by offshore energy sources such as wave, tide, wind, or ocean currents.
The benefits don't end there, though. Half the world's population lives within 120 miles of the sea. By placing data centers close to where people live and work, Microsoft hopes to reduce latency and accelerate web browsing, file downloads, and other data transfers.
While Hacopians thinks underwater data centers are a "neat sci-fi idea," he questions how economical they will be in the long run due to the challenges of managing and maintaining equipment that is located underwater (and must be kept absolutely watertight). For its part, Microsoft cautions that Project Natick is an early-stage research project. The company notes that during the 105-day-long Leona Philpot experiment, none of its hardware failed, and it was cooled more efficiently than expected. A second, much larger underwater center that could have a half-megawatt capacity is now in the works.
4. Free cooling
In their zeal to ensure the reliability of their equipment, many IT admins overcool their data centers. The average data center has two and a half times the cooling capacity needed to keep it running effectively, according to Lara Birkes, chief sustainability officer at Hewlett Packard Enterprise. While technology such as thermal sensors can be used to optimize temperature levels, some companies are eliminating the need for cooling equipment altogether by taking advantage of "free cooling" — using ambient air, instead of artificially cooled air, to reduce heat levels.
Facebook, which boasts some of the world's most energy-efficient data centers, is one of the industry's biggest proponents of free cooling. The social media giant deployed its first ambient air-cooled data center several years ago in Prineville, Oregon. The location was chosen because 50-year weather patterns revealed it to be low in humidity, a critical requirement to prevent hardware damage. The facility brings in cool outside air, reduces its temperature a further 20 degrees with water misting, and then distributes the air to the floor under the racks of custom-built servers and other equipment. Facebook boasts that removing complex HVAC systems allowed it to cut construction costs for this facility by 24 percent. It claims that the facility is 38 percent more energy efficient than a traditional data center.
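The free-cooling decision described above comes down to a check against temperature and humidity limits. A hedged sketch with hypothetical thresholds (real deployments follow published guidance such as ASHRAE's allowable envelopes), including the roughly 20-degree drop from evaporative misting described for the Prineville facility:

```python
def can_free_cool(outside_temp_f: float, relative_humidity: float,
                  max_temp_f: float = 80.0, max_rh: float = 0.65) -> bool:
    """True if ambient air alone can cool the floor without chillers.
    The thresholds here are hypothetical, not vendor or ASHRAE figures."""
    return outside_temp_f <= max_temp_f and relative_humidity <= max_rh

def can_free_cool_with_misting(outside_temp_f: float,
                               relative_humidity: float,
                               misting_drop_f: float = 20.0) -> bool:
    """Evaporative misting lowers intake air temperature (a nominal
    20 degrees F here), extending the free-cooling window.
    Simplification: ignores the humidity the misting itself adds."""
    return can_free_cool(outside_temp_f - misting_drop_f, relative_humidity)

print(can_free_cool(75, 0.40))               # cool, dry day: ambient air suffices
print(can_free_cool(90, 0.40))               # too warm for ambient air alone
print(can_free_cool_with_misting(95, 0.40))  # misting brings 95 F into range
```

Every hour spent inside the envelope is an hour the chillers stay off, which is where the construction and operating savings come from.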
Facebook has used the free-cooling strategy in every additional data center it has built. However, when Facebook announced last year that it would build a data center in Clonee, Ireland, that will harness cool air from the adjacent Irish Sea, the company noted it would need to filter the air more thoroughly due to its high salt content, which can be corrosive to data center equipment. These kinds of environmental challenges are common, Farnsworth notes, and may be tougher for some enterprises to deal with if they're more attached to their hardware.
"These days, companies like Google and Facebook control their server hardware, which means that they use commodity components and they don't mind if a few boards short out," he says. These companies use low-cost hardware, like that developed as part of the Open Compute Project, an initiative for open data center design and architecture. They are more tolerant of failures that are easily remedied, he notes.
"But most enterprises are not ready to jump on the Open Compute bandwagon, and as such do not want to void their warranties," says Farnsworth. "This means that outside air, along with the moisture and particulates that it brings, is not a welcome choice. Some companies that we have worked with welcome it, though, even for their legacy equipment, and they have seen huge dividends."
5. Liquid immersion cooling
There are several liquid cooling technologies available to data centers, of which liquid immersion is perhaps the most intriguing and effective. As the name suggests, this cooling method involves immersing data center equipment in a thermally (but not electrically) conductive liquid, such as a refrigerant, mineral oil, or an engineered dielectric fluid, in order to reduce heat. By some reports, immersion cooling consumes 90 percent less energy than air-based computer room air conditioner (CRAC) systems.
Today, liquid immersion cooling is used primarily in single implementations of high-performance computing systems. However, Farnsworth sees it as overkill for conventional business workloads. He does predict that some form of liquid immersion will eventually be tailored to the needs of typical enterprises. "Once the business case is worked out," he adds, "it could become unparalleled from a green perspective."
6. Kyoto cooling
This unique cooling method takes a decades-old method of cooling commercial buildings and reimagines it for the data center. Cooling systems designed for office and other buildings use what's known as a thermal wheel made of corrugated aluminum to precondition fresh air. As air is drawn into the building, the wheel (akin to a CPU's heat sink) warms the air in the winter and cools it in the summer, to ease the burden on the HVAC system. After the air is diffused through the building, it's expelled back outside.
Kyoto cooling (named after the 1997 climate change treaty known as the Kyoto Protocol) uses the same basic idea with a few modifications. Instead of intake and exhaust paths, the Kyoto system uses isolated circulation paths for inside and outside air streams. Fans draw heat from servers into the Kyoto cooling unit, where a constantly turning "Kyoto Wheel" absorbs the heat just like a thermal wheel and sends the cool air back to the data center.
In a separate space, outside air is also drawn into the Kyoto cooling unit. The Kyoto wheel absorbs that heat and disperses it, along with the heat from the data center, back outside. Kyoto cooling has proven amazingly efficient, using up to 92 percent less power than conventional cooling methods.
7. Micro data centers
The workload demands driven by the growth of mobile computing and the Internet of Things have made running services and applications out of a few centralized data centers a lot less efficient than it used to be.
One response to this has been the deployment of micro data centers. These rack-level systems contain a complete data-center infrastructure—including UPS, security, cooling systems, and fire protection—in a self-contained module. Deployed to thousands of locations around the globe, micro data centers bring data and applications closer to the edge of the network, ensuring a seamless experience for end users no matter where they are.
In a research study, the Uptime Institute found that micro data centers were up to 15 percent more energy efficient than centralized data centers, and that they lowered total operating costs by as much as 50 percent. Little wonder, then, that the micro data center market is expected to grow to $6.3 billion by 2020.
Green is the new black
These sustainable approaches represent the frontier today, but some could be data center staples in just a few years. You can prepare for this brave new world by continuing to focus on energy savings at the IT equipment level. The biggest tip of all: Look at the big picture, Hacopians says. "One needs to realize that you can't change just one variable to achieve efficiency. It needs to be a holistic approach."
This article/content was written by the individual writer identified and does not necessarily reflect the view of Hewlett Packard Enterprise Company.