Power Tussle: The Hidden Climate Costs Of AI

Published 23 days ago
Tiana Cline
(Getty Images)

Our smartest technology just might be accelerating environmental challenges rather than solving them.

If you’re finding that you’re using generative artificial intelligence (GenAI) more and more to search for things online, you’re not alone. A growing number of people are ditching traditional search engines like Google for large language model (LLM) chatbots like OpenAI’s ChatGPT and Anthropic’s Claude. And it makes a lot of sense: instead of paging through lists of links buried between ads, you get a direct answer to your question (with sources). It sounds brilliant, but there’s a catch – running AI models (especially LLMs) requires a lot of computational power. They also rely on massive data centers that can consume as much electricity as a small city. And the more popular AI becomes, the more power these large-scale operations will need, with McKinsey & Company predicting AI power consumption could reach up to 219 gigawatts by 2030 – supposedly enough electricity to power more than 163 million (average) homes.
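That projection can be sanity-checked with back-of-the-envelope arithmetic. The sketch below assumes an average household consumption of roughly 11,800 kWh per year – an illustrative figure, since actual household usage varies widely by country:

```python
# Back-of-the-envelope check: how many average homes could 219 GW
# of continuous power supply for a year?
# The ~11,800 kWh/year household figure is an assumption for illustration.

AI_POWER_GW = 219           # projected AI power demand (McKinsey, 2030)
HOURS_PER_YEAR = 8760       # 24 * 365
HOME_KWH_PER_YEAR = 11_800  # assumed average household consumption

annual_twh = AI_POWER_GW * HOURS_PER_YEAR / 1000                 # GWh -> TWh
homes = AI_POWER_GW * 1e6 * HOURS_PER_YEAR / HOME_KWH_PER_YEAR   # GW -> kW

print(f"{annual_twh:,.0f} TWh per year")
print(f"~{homes / 1e6:.0f} million homes")
```

Run continuously for a year, 219 GW works out to roughly 1,900 terawatt-hours, which lands in the neighborhood of the 163-million-homes figure under that assumed household average.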

The hyperscalers, large cloud service providers like Amazon, Google, Meta and Microsoft, are all scrambling to secure renewable and low-carbon energy supplies. Both Meta and Microsoft are betting big on nuclear to bolster electric grids. Microsoft recently signed a 20-year power deal with Three Mile Island, an infamous nuclear plant in Pennsylvania that shut down in 2019.


“A lot of the energy plans that the big cloud providers and data centers have is focused on sustainable energy. Nuclear is not renewable but at least we are not burning fossil fuels,” says Chris Wiggett, Head of AI and Data Analytics at NTT DATA Middle East and Africa.

“Connecting to a grid is expensive,” adds Emad Mostaque, a British entrepreneur and the former CEO of Stability AI. “The human brain uses 20 watts. When ChatGPT first came out, it used over 1,000 watts for every inference, every query.” Now, every prompt entered into ChatGPT uses around 2.9 watt-hours of energy. The energy use may be lower, but it’s still nearly 10 times what a single Google search takes.
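The per-query comparison is simple to express. A minimal sketch, assuming the widely cited estimate of roughly 0.3 watt-hours for a traditional web search (that figure is an assumption here, not from the article):

```python
# Compare estimated energy per query: ChatGPT vs. a traditional web search.
# The 0.3 Wh search figure is an assumed estimate for illustration.

CHATGPT_WH_PER_QUERY = 2.9
SEARCH_WH_PER_QUERY = 0.3  # assumed

ratio = CHATGPT_WH_PER_QUERY / SEARCH_WH_PER_QUERY
print(f"A ChatGPT prompt uses ~{ratio:.1f}x the energy of a search")

# Scale it up: a billion prompts a day
daily_mwh = CHATGPT_WH_PER_QUERY * 1e9 / 1e6  # Wh -> MWh
print(f"1 billion prompts/day ~ {daily_mwh:,.0f} MWh/day")
```

The ratio comes out just under 10, matching the “nearly 10 times” claim – and at a billion prompts a day, those watt-hours add up to thousands of megawatt-hours daily.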

“We’re looking to build Africa’s largest interconnected hyperscale data center, but we need to be as efficient as possible in our design and operation,” says Finhai Munzara, ADC’s interim CEO. “Data centers are big capital and come at a high cost to operate. We do use a lot of power so it’s important to ensure that we take some of the burden off the grid and generate as much on-site as we can.” Munzara is also looking into what happens if they overproduce solar power – whether they will be able to bank that excess energy into the grid and then offset it against some of their other emissions elsewhere. “It’s not just about delivery capacity. It’s about how we are going to power it in the most sustainable way.”


Hugging Face – an open-source hub for AI – has over a million models in its library. The platform’s CEO, Clément Delangue, wrote that a new model is created on the platform every second. A Harvard report found that training a single AI model – such as the LLMs that GenAI tools are built on – can consume thousands of megawatt-hours of electricity.
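The “thousands of megawatt-hours” scale is easy to reproduce with rough numbers. A hypothetical sketch, where the GPU count, per-GPU power draw, and training duration are all illustrative assumptions rather than figures from the report:

```python
# Rough illustration of training-run energy: GPUs x power x time.
# All inputs are hypothetical, chosen only to show the order of magnitude.

NUM_GPUS = 10_000   # assumed cluster size
GPU_KW = 0.7        # assumed ~700 W draw per accelerator
TRAINING_DAYS = 30  # assumed training duration

energy_mwh = NUM_GPUS * GPU_KW * TRAINING_DAYS * 24 / 1000  # kWh -> MWh
print(f"~{energy_mwh:,.0f} MWh for one training run")
```

Even with these modest assumptions, a single run lands around 5,000 MWh – squarely in the “thousands of megawatt-hours” range.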

But Wiggett believes that as AI models evolve, there’s also a chance they will consume less power. Open-source AI models, which are often built collaboratively, can consume less energy than closed, proprietary models owned by a single business. Wiggett also explains that simpler algorithmic solutions require less compute power than their AI counterparts.

A great example is DeepMind’s AI model, AlphaGo, which was built in 2015 to master the ancient Chinese game of Go (a complex board game of strategy, creativity and ingenuity). “AlphaGo was the first model, and it took a very long time and massive compute power to perform at the level it did. If you look at the subsequent models, they are all algorithm-based yet they outperform the first models,” says Wiggett. “If we look at the evolution of AI, we have to go heavy compute, and then over time, the evolution will involve going lesser compute. From a technology perspective, it will get better over time.”

You can already see a shift in how data centers consume water. Despite Amazon, Microsoft and Google all reporting that their water usage has escalated due to AI demands, the hyperscalers have plans in place to become water-positive in the near future.


Amazon Web Services (AWS) is funding over 20 water replenishment projects worldwide. In other words, they’re giving communities water instead of taking it away. In late 2024, Microsoft launched a new data center design that optimizes AI workloads and consumes zero water for cooling.

“Zero water means that once a data center is built, all the water pumped into the tanks is circulated in a closed loop,” explains Munzara. “The water is cooled with ambient air.” Water usage efficiency (WUE) is a metric that has been tracked since the first generation of data centers. To cut water costs, data centers are often built in icy areas to take advantage of natural cooling. Microsoft has even conducted experiments to see if underwater data centers could work, and now an American company called Lonestar is looking to put data centers on the moon.

“The moon has no weather, no climate change, no atmosphere and it has a perfect view of Earth rotating underneath it,” said Lonestar’s CEO, Christopher Stott, in an interview. In the meantime, while AI workloads continue to strain data center resources, other eco-friendly cooling options to manage energy efficiency will have to do.
