Imagine you’re shopping for a new laptop. You find a model that can do some pretty cool stuff, but it uses much more electricity than your current one—ten times, maybe thirty times more. But no one can tell you the exact amount because it’s a closely guarded secret.
Oh, and there’s a catch: this laptop comes with a funnel on top. Every time you ask it to tell you a joke or generate a fun image, it needs a water refill. How much water? No one knows, because the company won’t say. Sound like an upgrade worth making? For those of us who care about an ever-warmer, ever-thirstier Earth, probably not.
That laptop is a metaphor for our current AI gold rush. Or at least something like it. The truth is, we don’t have precise numbers on the environmental cost of AI. The carbon dioxide emissions generated by each AI prompt, not to mention the vast amounts of water needed to cool the thousands of servers processing those prompts, are still a mystery. Researchers can only provide rough estimates, while companies like Google, Microsoft, and OpenAI hold the exact data close to their chests.
Since the launch of ChatGPT in 2022, companies have noticeably clamped down on transparency. Sasha Luccioni, a seasoned expert in AI energy usage and climate lead at Hugging Face, describes it as an information blackout. “Not a single company that offers AI tools, that I know of, provides energy usage and carbon footprint information,” Luccioni says, clearly frustrated. “We don’t even know how big models like GPT are. Everything is a company secret.”
AI Is Making Us All Dirtier
It’s ironic. Tech giants like Google and Microsoft love to brand themselves as climate-conscious. They can tell you exactly how many kilograms of carbon your next plane flight will emit. But when it comes to your next AI-written essay or that AI-generated image of the Pope in a puffy jacket, they’re mum.
There might be a good reason for that silence. If we knew the true environmental cost of AI, we might start holding each other accountable for using it so recklessly.
Despite the secrecy, we have some idea of the scale of the problem. In its 2024 sustainability report, Google revealed that its greenhouse gas emissions skyrocketed by 48% between 2019 and 2023, with the bulk of that increase occurring since 2022. Microsoft’s 2024 sustainability report shows a similar trend, with a 29.1% rise in emissions since 2020.
Both companies blame third parties—specifically, the data centers built to support their AI operations. They also note that these data centers handle more than just AI tasks, which is true and contributes to the difficulty of pinpointing AI’s exact energy cost. However, they can’t fully deny the obvious: the surge in construction is driven by the need for data centers “designed and optimized to support AI workloads,” as Microsoft puts it.
Google’s report acknowledges, “We have a long way to go to meet our 2030 target.” Considering that data center energy demand is projected to increase by 160% by 2030, that’s quite the understatement. A May 2024 Goldman Sachs report estimates that the carbon dioxide emissions from data centers could more than double between 2022 and 2030.
So, who’s to blame for this surge? As Google’s report delicately puts it: “Reducing emissions may be challenging due to increasing energy demands from the greater intensity of AI compute.”
But don’t tell that to Sasha Luccioni. “That always pisses me off,” she says. “AI isn’t a vertical. It’s a horizontal—a tool that’s used across many different verticals. Google Maps uses AI, as do all the ads we see online, precision agriculture, military drones. How do you calculate what part AI plays?”
In other words, you can choose to steer clear of an energy hog like cryptocurrency, but not this one: Google has made AI-generated search results a default feature, with no way to opt out. So even if you think you’ve never used an AI tool, if you’ve Googled recently, you’re contributing to the problem. For those concerned about the climate impact, Luccioni suggests switching to a non-AI search engine like Ecosia.
The Thirst of AI
If Google, Microsoft, and the other big players in generative AI were to fully disclose their data, just how bad would it be? That’s a tough question. Expert guesses range from “pretty bad” to “climate disaster.”
The International Energy Agency estimates that a single ChatGPT prompt consumes nearly 3 watt-hours of electricity. In comparison, a single Google search, before AI was integrated into the results, required just 0.3 watt-hours. The energy required to answer hundreds of millions of ChatGPT queries each day could power 33,000 U.S. households, according to University of Washington researcher Sajjad Moazeni. And that figure doesn’t even account for the energy consumed during the training of these AI models, which is anyone’s guess.
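Those figures are enough for a rough cross-check. The sketch below is a back-of-the-envelope calculation, not reported data: it assumes roughly 300 million prompts per day as a stand-in for “hundreds of millions,” and an average U.S. household consumption of about 10,500 kWh per year, layered on top of the IEA’s roughly 3 watt-hour estimate.

```python
# Back-of-the-envelope check of the energy figures cited above.
# Assumed, not reported: ~300 million prompts/day and an average U.S.
# household using ~10,500 kWh/year (~29 kWh/day).

WH_PER_PROMPT = 3.0          # IEA estimate: ~3 Wh per ChatGPT prompt
WH_PER_CLASSIC_SEARCH = 0.3  # pre-AI Google search: ~0.3 Wh
PROMPTS_PER_DAY = 300e6      # assumption standing in for "hundreds of millions"
HOUSEHOLD_KWH_PER_DAY = 10_500 / 365  # assumed average U.S. household

daily_kwh = WH_PER_PROMPT * PROMPTS_PER_DAY / 1_000  # Wh -> kWh
households = daily_kwh / HOUSEHOLD_KWH_PER_DAY

print(f"Daily ChatGPT energy: ~{daily_kwh / 1e6:.1f} GWh")
print(f"Equivalent U.S. households: ~{households:,.0f}")
print(f"One prompt vs. one classic search: {WH_PER_PROMPT / WH_PER_CLASSIC_SEARCH:.0f}x")
```

Under those assumptions the answer lands in the low 30,000s of households, consistent with Moazeni’s estimate. The point is less the exact number than how easy this kind of check would be if companies published even a single per-prompt figure.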
AI also has an insatiable thirst for water. When OpenAI was in the final month of training its GPT-4 model at Microsoft data centers in West Des Moines, Iowa, the facility had to pump in 11.5 million gallons of water—6% of the entire district’s water supply. The situation was so dire that West Des Moines warned Microsoft not to build any more data centers unless it could reduce its water usage, echoing similar concerns in Arizona and a 2021 water dispute in Oregon involving Google data centers.
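The same quick arithmetic works on the water figure. A minimal sketch, assuming the 6% share and the 11.5 million gallons refer to the same period, backs out the implied district-wide supply; this is an inference from the reported numbers, not a figure from Microsoft or the local water works.

```python
# Back out the implied district-wide water supply from the reported figures.
# Assumption: the 6% share and the 11.5 million gallons cover the same period.

DATA_CENTER_GALLONS = 11.5e6  # reported pumping for Microsoft's facilities
SHARE_OF_DISTRICT = 0.06      # reported share of the district's supply

implied_district_total = DATA_CENTER_GALLONS / SHARE_OF_DISTRICT
print(f"Implied district supply for the period: ~{implied_district_total / 1e6:.0f} million gallons")
```

That works out to roughly 190 million gallons, with a single corporate customer drawing a noticeable slice of it.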
There is some good news. Data centers are increasingly drawing water from non-potable sources, and companies are finding ways to use less water overall. Some data centers are adopting special HVAC systems that reduce water usage, although these systems do increase electricity consumption.
Can AI Help Us Go Green?
What about the exponential growth of wind and solar power? Surely that could help sustain our AI revolution, right?
Not so fast, say researchers. It’s impossible to know whether your AI query is being processed in a data center powered by green energy in Europe, or by coal in India, or by oil in Saudi Arabia. Even Europe isn’t greening its grid quickly enough to keep up with Silicon Valley’s AI obsession.
“Renewable energy is definitely growing,” Luccioni says. “The problem is it’s not growing fast enough to keep up with AI’s growth.”
Tech companies are trying to bridge this gap with carbon credits, which, as a recent Bloomberg investigation points out, aren’t the same as actually reducing emissions. Microsoft and Amazon rely on credits for more than 50% of their so-called renewable energy. Meta, on the other hand, does somewhat better, with just 18% of its green energy coming from carbon credits. Luccioni also gives Meta some credit for being more transparent about its AI data, partly because the company currently has less at stake in the AI game.
Even if AI-focused data centers were powered entirely by renewable energy, they would still be hogging green power that could be used elsewhere. This isn’t just a theoretical debate. In Pennsylvania, Amazon is fighting locals over the output of a 2.5 GW nuclear power station, where it plans to build new data centers. This could be the first of many similar legal battles across the country.
The Dilemma of AI’s Environmental Impact
So, can AI help us become more environmentally friendly? Could it help us model extreme weather more accurately, or perhaps develop scalable carbon capture solutions? That’s a possibility, and one worth exploring in a future discussion.
But for now, one thing is clear: most of us, whether we’re Gen Z students turning in AI-written papers or boomers sharing AI-generated cat pictures, are not using AI to save the planet. Maybe it’s time to consider leaving this powerful tool to those who are.