Does AI use a lot of energy and water? Yes and no

Ever since artificial intelligence became a topic of popular conversation, the environmental cost of all these large language models and the massive server farms that make them possible has been the subject of much concern. Every week or two, it seems, there is another article about the vast appetite these systems have for both power and water, their greenhouse-gas emissions, and their overall impact on the environment. So I found it interesting to read Google's assessment of these factors in a recently published study titled "Measuring the environmental impact of delivering AI at Google Scale." The company also wrote a blog post summarizing some of the numbers, in which it said that the average Gemini prompt "uses 0.24 watt-hours of energy, emits 0.03 grams of carbon dioxide equivalent, and consumes 0.26 milliliters – or about five drops – of water." The overall per-prompt energy impact, according to Google's scientists, is "equivalent to watching television for less than nine seconds."
Since Google runs Gemini and obviously has an incentive to understate how much power and water it uses, you might be skeptical of these results, as I (and others) were when the paper was released. The Verge, for example, wrote a piece quoting a number of experts who said that the Google study was misleading because it "omits some key data." What key data? If you read the article, it says that Google only looked at the direct water and power use of its server farms and related AI equipment – that is, the amount of water and electricity that these systems consumed while running Gemini queries – rather than looking at indirect use. That would include water consumed by power companies that generate the electricity to power these data centers, whether it's water to drive electrical turbines or to cool gas or nuclear power systems. From the Verge article:
“They’re just hiding the critical information,” says Shaolei Ren, an associate professor of electrical and computer engineering at the University of California, Riverside. “This really spreads the wrong message to the world.” Ren has studied the water consumption and air pollution associated with AI, and is one of the authors of a paper Google mentions in its study. A big issue experts flagged is that Google omits indirect water use in its estimates. Its study included water that data centers use in cooling systems to keep servers from overheating. As a result, with Google’s estimate, “You only see the tip of the iceberg,” says Alex de Vries-Gao, founder of the website Digiconomist and a PhD candidate at Vrije Universiteit Amsterdam Institute for Environmental Studies who has studied the energy demand of data centers.
So Google is playing fast and loose with the data, right? Except that the company says right in its paper that it is only including direct water and energy use, so it's not really hiding it per se. But still, wouldn't it be better to include indirect use as well? Sure. So Andy Masley, a former high-school physics teacher who now runs the DC branch of Effective Altruism (a group that tries to apply reason and factual analysis to charitable contributions and activity) did exactly that in a response to Google's post and the Verge piece. Based on public data related to the water and power consumption of data centers in the United States, if Google included the indirect water use of power plants generating the electricity that it uses for Gemini AI prompts, then Google's AI systems would use about 1.6 mL of water per prompt, or about 32 drops instead of five. So that's about six times as bad, right? Sure. But as bad as what? Here's how Masley puts it:
Putting this number in context, the average American’s lifestyle uses 1600 liters of water per day when you include our food production, energy generation, and at home use (this consumptive use of specifically surface and groundwater, or “blue water”, where the water is not returned to its original source). This means that Google’s original claim was that each Gemini prompt uses 0.000015% of your daily water footprint, and the correct value (that Google is quote “hiding” from you) is 0.00010%. Instead of 1 in 6.7 million of your daily water use, a Gemini prompt is actually as much as 1 in 1 million. Google originally misled you by announcing that it takes 650 Gemini prompts worth of water for 1 second of your shower. In reality, it only takes 97.
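Masley's fractions are easy to verify. Here is a quick back-of-the-envelope check in Python, using only the figures quoted above (0.26 mL direct use per prompt, roughly 1.6 mL including indirect use, and about 1,600 liters of daily per-person blue-water footprint); the exact decimals depend on which per-prompt estimate you start from, so treat the output as approximate:

```python
# Per-prompt water use as a fraction of a person's daily blue-water footprint,
# using the figures quoted above.
DAILY_WATER_ML = 1_600 * 1_000   # ~1,600 liters/day, in milliliters

direct_ml = 0.26                 # Google's direct-use estimate per prompt
total_ml = 1.6                   # Masley's estimate including indirect use

direct_fraction = direct_ml / DAILY_WATER_ML
total_fraction = total_ml / DAILY_WATER_ML

print(f"direct only:    {direct_fraction:.2e} of daily use")
print(f"incl. indirect: {total_fraction:.2e} of daily use")
print(f"prompts per day of water: {1 / total_fraction:,.0f}")
# Even with indirect use included, one prompt is about one millionth
# of a day's water footprint.
```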
Note: In case you are a first-time reader, or you forgot that you signed up for this newsletter, this is The Torment Nexus. You can find out more about me and this newsletter in this post. This newsletter survives solely on your contributions, so please sign up for a paying subscription or visit my Patreon, which you can find here. I also publish a daily email newsletter of odd or interesting links called When The Going Gets Weird, which is here.
AI and your microwave

What Masley is trying to do is to compare AI or data center energy and water use to some other normal human activities. Knowing that AI prompts require ten times as much energy as a Google search is only relevant if you know how much energy a normal Google search requires. Do you know how much water the other things you do in your daily life might use, either directly or indirectly? I bet you don't – I certainly didn't. And in case you (like me) have thought to yourself "How much should I trust a guy who used to teach high-school physics and runs an Effective Altruism group?" there are very similar numbers and analysis in a post from Hannah Ritchie, a Scottish data scientist and senior researcher at the University of Oxford, and deputy editor at Our World in Data. "Google says that its median text query uses around 0.24 Wh of electricity," Ritchie writes. "That’s a tiny amount: equivalent to microwaving for one second."
Other industry estimates (for what they might be worth) are in a similar range: Sam Altman, CEO of OpenAI, published a blog post in June in which he mentioned that a standard text query uses 0.34 Wh of electricity, which is similar to the estimate that the nonprofit research outfit EpochAI came up with in February. And in her post, Ritchie mentions that Mistral AI, a French AI company, conducted an environmental analysis of its LLMs and their energy use earlier this year, and employed a life-cycle assessment that the company said was conducted by external consultancy agencies. While the methodology "was not that transparent or detailed," Ritchie says that it did provide breakdowns of where energy impacts came from in the overall process of developing and running an LLM, and the impacts were low: just 1 gram of CO2 per page of text.
As someone who used to write this kind of article for a living, I confess that I am sensitive to the criticisms that Masley levels against the Verge piece, including the fact that the headline and the way the article is framed make it sound like Google is hiding something important, and imply that by doing so, the company is covering up the massive energy use of its server farms.
Stop and think about what this headline conjures up for you. For me, it sounds like Google is being shifty with numbers to present its AI model as not using much water, and in reality it’s using enough water to at least consider as a possible problem. This leaves us with a misleading headline and article. Readers are left with the sense that Google’s actively lying to them about the true water costs of their models, and the unstated but seemingly obvious next step is to assume that the water costs are significant. In reality, Google’s paper makes it clear that they’re measuring only the water in data centers themselves. Including the offsite and onsite water means each Gemini prompt uses 1/38,000th of the water cost of a single beef patty. I think readers would come away from this article with a wildly different and wrong idea of how much water is involved.
A comment on the Masley post also notes that the scientist (or PhD candidate) quoted in the Verge article, Alex de Vries-Gao, is not really an expert on either artificial intelligence or computing technology (or power consumption, it seems) and was last in the news for this article on AI and energy consumption, which got a fair amount of attention. The problems with this paper include the fact that "it essentially assumes that every GPU manufactured is used exclusively for AI training (no consumer use, no non-AI data center use, no inference or inference-specific hardware) and runs at its theoretical TDP [or maximum power consumption] forever." It also conflates manufacturing energy with operational energy instead of amortizing it over the hardware's lifespan. Accounting for these errors, de Vries-Gao appears to be overstating the power consumption of AI by at least 200 percent and possibly 300 percent.
Play golf or use ChatGPT?

In a previous post on the same general topic, Masley argued that "discouraging individual people from using chatbots is a pointless distraction for the climate movement" because the amount of water and energy used by ChatGPT and the other major AI engines is a rounding error in the larger scheme of things (things being society in general). Even assuming that a ChatGPT query uses 3 watt-hours of energy, or ten times as much as a Google search (Gemini actually uses an order of magnitude less than that, according to the recent paper), this amount is a vanishingly small number compared to some other normal human activities that we take for granted. According to Masley's estimates, 3 watt-hours (which costs about 5 one-hundredths of a cent in Washington, DC) is enough to leave a single light bulb on for 3 minutes, play a gaming console for 1 minute, run a microwave for 10 seconds, or use a laptop for 3 minutes.
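Those equivalences check out if you assume typical power draws for each appliance. The wattages below are my own rough assumptions (a 60 W bulb, a 180 W console, a 1,100 W microwave, a 60 W laptop), not Masley's exact inputs, so the results are a sanity check rather than a reproduction of his math:

```python
# How long each appliance runs on 3 Wh, assuming typical power draws.
QUERY_WH = 3.0  # assumed energy per ChatGPT query, in watt-hours

appliances_w = {
    "light bulb": 60,       # incandescent
    "gaming console": 180,
    "microwave": 1_100,
    "laptop": 60,
}

for name, watts in appliances_w.items():
    seconds = QUERY_WH / watts * 3600  # Wh / W = hours; x3600 = seconds
    print(f"{name}: {seconds:.0f} seconds")
# The bulb and laptop come out to ~3 minutes, the console to ~1 minute,
# and the microwave to ~10 seconds, matching the figures above.
```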
In other words, it's true that AI systems like ChatGPT use energy, but it isn't really that much compared to other things in your life that you've grown accustomed to. You could argue -- and many people definitely have -- that AI adds nothing significant to our lives and is therefore a waste of resources. But you could make the same argument about other things that human beings routinely do that produce very little in the way of broader societal benefits, and one of my favorites to pick on is golf (which I enjoy playing but am not very good at). According to Construction Physics, the US has around 16,000 golf courses, and they use about a billion gallons of water a day, or 0.3% of total US water use. Data centers in the US used around 66 million gallons per day in 2023, which works out to about 6% of the water used by US golf courses. Here's Masley again:
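The golf comparison is a one-line division, using the two daily-consumption figures cited above:

```python
# US water use: golf courses vs. data centers, gallons per day,
# using the figures cited above.
golf_gal_per_day = 1_000_000_000      # ~1 billion gallons across ~16,000 courses
datacenter_gal_per_day = 66_000_000   # 2023 estimate for US data centers

ratio = datacenter_gal_per_day / golf_gal_per_day
print(f"Data centers use {ratio:.1%} of what golf courses use")  # 6.6%
```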
If a steel plant or college or amusement park were built in a small town, it would be normal for it to use a lot of the town’s water. We wouldn’t be shocked if the main industry in the town were using a sizable amount of the water there. If it provided a lot of tax revenue for the town and was not otherwise harming it, I think we would see this as a positive, and wouldn’t talk in an alarmed way about the specific percentages of the town’s water the industry was using. We should think of AI like we do with any other industry. In 2024 all AI in America collectively consumed as much water as the collective lifestyles of the residents of Paterson, New Jersey.
If you want to look at the overall carbon emissions of artificial intelligence instead of just water usage, the comparison tilts even further in AI's favor: human beings are as much as 1,500 times as wasteful as AI when it comes to writing text and 2,900 times as wasteful in creating images, according to a recent study. It found that AI systems "emit between 130 and 1500 times less CO2e per page of text generated compared to human writers, while AI illustration systems emit between 310 and 2900 times less CO2e per image than their human counterparts." The study thoughtfully notes that this analysis doesn't account for "social impacts such as professional displacement, legality, and rebound effects," and adds that "AI is not a substitute for all human tasks," which is reassuring.

Should we be concerned about the energy and resource consumption of AI engines or data centers in general? Obviously yes, especially when the number of data centers is increasing rapidly. But it's worth noting that, according to Google, over the past year the energy use and total carbon footprint of Gemini have dropped by a factor of 33 and a factor of 44, respectively. There are plenty of concerns worth raising about artificial intelligence or machine learning or whatever term you want to use for ChatGPT and Gemini and Claude and Perplexity, whether it's the impact they have on creative industries or the copyright issues with indexing, or the proliferation of AI-powered chatbots doing therapy – sometimes badly. But in the larger scheme of human endeavor, the industry's energy and water use doesn't seem that out of proportion when compared to other industries or areas of human activity. We would do better to raise a hue and cry about golf!
Got any thoughts or comments? Feel free to either leave them here, or post them on Substack or on my website, or you can also reach me on Twitter, Threads, BlueSky or Mastodon. And thanks for being a reader.