Rethinking AI through the lens of efficiency
AI technologies continue to reshape global industries at an astounding pace. But in 2025, we must confront what may be the biggest question facing us as this evolution unfolds: can we scale AI without scaling its negative environmental impacts?
These impacts are measurable, real, and growing fast. As more organizations integrate generative AI models into day-to-day operations, power demands are accelerating. According to the International Energy Agency, data center electricity consumption is set to more than double to around 945 terawatt-hours (TWh) by 2030, slightly more than Japan's total electricity consumption today.
This scenario illustrates the paradox of AI: it can pave the way for a low-emissions energy future through innovations like nuclear fusion and smart grids, but its current energy demands may worsen the environmental issues it aims to solve.
At HPE, we believe there’s a smart way forward, one that doesn’t force a trade-off between digital innovation and environmental responsibility. AI can be a powerful force for good, but only if it’s designed, deployed, and scaled with sustainability in mind.
The big ‘E’: Efficiency (not energy)
Much of today’s conversation around AI and sustainability starts with energy supply. But before addressing supply, it’s imperative to optimize the infrastructure, software, and systems that actually run the AI.
This means focusing on efficiency first. Technologies like direct liquid cooling (DLC) are now essential for high-performance compute systems, especially as modern GPUs outgrow traditional air-cooled environments. Among the world’s 10 fastest supercomputing systems, seven run on HPE’s leadership-class HPE Cray Supercomputing EX systems, based on the industry’s first 100% fanless direct liquid cooling system architecture.
At the infrastructure level, HPE’s ProLiant Gen12 servers offer significant sustainability gains through improved performance and efficiency. Compared to Gen8 systems, Gen12 servers can deliver up to 26x the performance while cutting power and cooling costs by as much as 87%. By consolidating older infrastructure onto Gen12, organizations can dramatically lower their energy use and shrink their data center footprint by up to 96%.
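To see how these "up to" figures relate, here is a back-of-envelope sketch of the consolidation math. It assumes, purely for illustration, that the stated 26x performance ratio applies uniformly to a hypothetical fleet of 100 older servers; real results depend on workload mix and utilization.

```python
# Back-of-envelope consolidation estimate (illustrative numbers only).
# Assumes the stated "up to" ratios apply uniformly, which real
# deployments rarely achieve.

def consolidation_estimate(old_servers, perf_ratio=26, power_cost_reduction=0.87):
    """Estimate new server count, footprint reduction, and relative
    power/cooling cost after consolidating onto newer hardware."""
    new_servers = -(-old_servers // perf_ratio)  # ceiling division
    footprint_reduction = 1 - new_servers / old_servers
    relative_power_cost = 1 - power_cost_reduction
    return new_servers, footprint_reduction, relative_power_cost

new, footprint, power = consolidation_estimate(100)
print(f"Servers: 100 -> {new}")                        # 100 -> 4
print(f"Footprint reduction: {footprint:.0%}")         # 96%
print(f"Power/cooling cost vs. before: {power:.0%}")   # 13%
```

With a hypothetical 100-server fleet, the arithmetic reproduces the up-to-96% footprint reduction cited above: 100 servers consolidate onto 4.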
Smart AI for Sustainability
AI itself can be used to advance sustainability across industries. HPE is working with governments, research institutions, and national labs to apply AI in areas like climate modeling, clean energy development, and grid optimization. These are not far-off breakthroughs—they’re happening today.
At the same time, we're developing tools that help customers run AI more responsibly. The HPE Sustainability Insight Center, now integrated with OpsRamp, provides detailed visibility across infrastructure from edge to hybrid and multi-cloud environments, broken down by site and device.
This kind of visibility empowers customers to make smarter trade-offs across performance, cost, and environmental impact. It’s not just about tracking emissions—it’s about informing every decision, from procurement to deployment, with sustainability data that matters.
Designing for the Edge
As AI adoption grows, organizations are rethinking how, and where, their models run. While training large models has dominated headlines, inferencing, the stage where a trained model is applied to new inputs, is expected to be even more power-hungry. According to Uptime Intelligence, some projections show that inferencing could demand three times more energy than training by 2028.
To meet these demands, many enterprises are turning to smaller, more targeted models that can run efficiently at the edge. This approach reduces the need to move massive volumes of data across networks, a process that’s both bandwidth-intensive and energy-expensive.
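To make the data-movement point concrete, the sketch below compares the bytes sent upstream when streaming raw data to a central model versus running a small model at the edge and sending only results. The payload sizes and sample counts are hypothetical placeholders, not measurements.

```python
# Illustrative comparison of daily network payloads (hypothetical sizes).
RAW_SAMPLE_BYTES = 2_000_000   # assumed: one uncompressed camera frame
RESULT_BYTES = 200             # assumed: small JSON of detected labels
SAMPLES_PER_DAY = 100_000

central = RAW_SAMPLE_BYTES * SAMPLES_PER_DAY   # stream everything upstream
edge = RESULT_BYTES * SAMPLES_PER_DAY          # infer locally, send results

print(f"Central: {central / 1e9:.1f} GB/day")          # 200.0 GB/day
print(f"Edge:    {edge / 1e9:.2f} GB/day")             # 0.02 GB/day
print(f"Reduction factor: {central // edge}x")         # 10000x
```

Even with these rough assumptions, moving inference to the edge cuts network traffic by orders of magnitude, which is where the bandwidth and energy savings come from.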
Smarter network traffic management can further support this shift. Tools like HPE Aruba Networking AIOps use AI to optimize peer-to-peer configurations and reduce redundancy, increasing network capacity by up to 25%. When paired with energy-efficient networking hardware and intelligent routing, enterprises can reduce unnecessary power use and improve network sustainability.
Building AI Responsibly
AI can be a catalyst for sustainability rather than a threat to it, but that depends on how it’s built.
Responsible AI starts with intentionality: choosing the right tool for the job, optimizing software and model size, and designing systems to run on energy-efficient infrastructure. But it also means asking the harder question: when is the benefit worth the social and environmental cost of developing and using AI? Striking a balance between transformative impact and resource demand is the core of building more sustainable AI. We know how to design out a dollar of cost. We don’t yet know how to design out a ton of carbon.
Looking to embed sustainability into your AI strategy?
Explore our AI Sustainability Whitepaper to learn more.