Does AI Really Use 10 Gallons of Water Per Image?

The viral claim is wrong by 100-600x. Here's what the peer-reviewed research actually says about AI water consumption.


Someone told me I wasted 10 gallons of water making one of those trending AI caricature images. So I did what any reasonable person would do: I pulled the peer-reviewed research and did the math myself.

Spoiler: the "10 gallons per image" claim isn't supported by any credible source. The actual number is closer to 15-60 milliliters — about a shot glass's worth of water. That makes the viral claim exaggerated by 100-600x.

Let's break down where the water actually goes, what the research says, and why this myth spread so fast.

How Data Center Cooling Actually Works

Data centers do use water, primarily for cooling. The most common method is evaporative cooling towers, which work by spraying water over fill media while fans draw air across. About 1% of the circulating water evaporates, carrying heat away from the servers. This evaporated water — roughly 80% of total water withdrawn — is what data centers actually "consume."

But here's what most headlines miss: AI workloads are driving a massive shift toward water-free cooling.

Traditional servers generate 3-8 kilowatts per rack. NVIDIA's latest AI systems hit 120-132 kW per rack. At those densities, air cooling fails completely — you'd need hurricane-force airflow to dissipate the heat. This is pushing the industry toward direct-to-chip liquid cooling, where coolant flows through microchannels attached directly to GPUs in closed-loop systems that consume no water at all.

Microsoft's Zero-Water Commitment

Microsoft has announced that all new data center designs from August 2024 onward will use zero-water evaporative cooling. Google's Finland facility already uses Baltic Sea water in a closed loop, consuming minimal freshwater. The "AI drains lakes" narrative is increasingly outdated.

What the Peer-Reviewed Research Actually Says

The most-cited study on AI water consumption comes from UC Riverside researchers (Li, Yang, Islam, and Ren — "Making AI Less Thirsty," 2023).

Critically, no peer-reviewed study has measured water usage specifically for AI image generation. The "10 gallons per image" figure appears to be viral misinformation without academic backing.

Doing the Math on Image Generation

Working backward from the best available energy data: Hugging Face and Carnegie Mellon researchers measured image generation at 0.003-0.012 kWh per image. Applying the standard water conversion factor (~5 liters per kWh for cooling + electricity generation), this yields:

Per-Image Water Estimate

15-60 milliliters per AI-generated image — roughly what fills a shot glass or two.
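That arithmetic can be checked in a few lines. All inputs come from the article itself: the measured energy range per image and the ~5 L/kWh combined cooling-plus-generation conversion factor.

```python
# Back-of-envelope water estimate per AI-generated image.
# Inputs from the article: measured energy per image (Hugging Face /
# Carnegie Mellon) and a ~5 L/kWh factor covering data center cooling
# plus electricity generation.
ENERGY_PER_IMAGE_KWH = (0.003, 0.012)  # measured range, kWh per image
WATER_PER_KWH_L = 5.0                  # cooling + generation, liters per kWh

low_ml = ENERGY_PER_IMAGE_KWH[0] * WATER_PER_KWH_L * 1000   # liters -> mL
high_ml = ENERGY_PER_IMAGE_KWH[1] * WATER_PER_KWH_L * 1000

print(f"{low_ml:.0f}-{high_ml:.0f} mL per image")  # 15-60 mL per image
```

Even the high end of that range sits three orders of magnitude below 10 gallons (~37,850 mL).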

OpenAI CEO Sam Altman stated in June 2025 that the average ChatGPT query uses approximately 0.3 milliliters of water. Google disclosed that median Gemini text prompts consume just 0.26 milliliters — about five drops.

Putting AI Water Use in Perspective

Individual AI queries have negligible water footprints compared to everyday activities:

  • 🍔 One hamburger: 2,000-3,000 L = 30,000-200,000 AI images
  • 👕 One cotton t-shirt: 2,700 L = 45,000-180,000 AI images
  • 🥑 One avocado: 227 L = 3,800-15,000 AI images
  • 🥜 One almond: 5 L = 80-300 AI images
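The equivalences above are simple division — each item's water footprint (as quoted in the article) divided by the per-image range of 15-60 mL:

```python
# How many AI-generated images equal one everyday item, using the
# article's per-image estimate of 15-60 mL (0.015-0.060 L).
PER_IMAGE_L = (0.015, 0.060)

# Water footprints quoted in the article, in liters
# (hamburger uses the midpoint of the 2,000-3,000 L range).
items_l = {
    "hamburger": 2500,
    "cotton t-shirt": 2700,
    "avocado": 227,
    "almond": 5,
}

for name, liters in items_l.items():
    fewest = liters / PER_IMAGE_L[1]  # assuming water-hungry images
    most = liters / PER_IMAGE_L[0]    # assuming efficient images
    print(f"1 {name} ≈ {fewest:,.0f}-{most:,.0f} AI images")
```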

At the macro level, all U.S. data centers combined use approximately 50 million gallons per day for on-site cooling. AI represents roughly 15-20% of data center energy, suggesting ~10 million gallons daily for AI specifically.

For comparison: U.S. golf courses use 2 billion gallons daily. Residential lawns consume 9 billion gallons. Livestock operations require 200+ billion gallons. Data centers represent less than 1% of U.S. freshwater consumption.
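The macro comparison works out the same way — a quick sanity check using the article's own aggregate figures:

```python
# Macro-level check: AI's estimated share of U.S. data center cooling
# water, compared against other uses. All figures from the article.
DATACENTER_GAL_PER_DAY = 50e6  # all U.S. data centers, on-site cooling
AI_SHARE = (0.15, 0.20)        # AI's estimated share of DC energy

ai_low = DATACENTER_GAL_PER_DAY * AI_SHARE[0]   # ~7.5 million gal/day
ai_high = DATACENTER_GAL_PER_DAY * AI_SHARE[1]  # ~10 million gal/day

golf_courses_gal = 2e9  # U.S. golf courses, gallons per day
lawns_gal = 9e9         # residential lawns, gallons per day

print(f"AI vs. golf courses: {ai_high / golf_courses_gal:.2%}")  # 0.50%
print(f"AI vs. lawns: {ai_high / lawns_gal:.2%}")                # 0.11%
```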

Provider Efficiency Varies Dramatically

Water Usage Effectiveness (WUE) — measured in liters per kilowatt-hour — reveals stark geographic differences:

| Provider | Global WUE (2024-25) | Best Regional WUE | Notes |
| --- | --- | --- | --- |
| AWS | 0.15 L/kWh | Europe sites | 40% improvement since 2021 |
| Microsoft | 0.27 L/kWh | EMEA: 0.03 L/kWh | Arizona: 1.52 L/kWh (hot climate) |
| Google | ~1.1 L/kWh (est.) | Finland: near-zero | Fleet-wide PUE: 1.09 |
| Meta | 0.20 L/kWh | Closed-loop: 0 | 80% more efficient than average |
| Industry Average | 1.8 L/kWh | | Older facilities |
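Combining the per-image energy range with these WUE figures shows how much location matters. Note this is a sketch of on-site cooling water only — WUE excludes the off-site water embedded in electricity generation, so these numbers sit below the all-in 15-60 mL estimate:

```python
# On-site cooling water per image at different WUE values.
# WUE figures from the table above; energy range from the article.
# WUE covers only data center cooling, not electricity generation.
ENERGY_KWH = (0.003, 0.012)  # kWh per image

wue_l_per_kwh = {
    "AWS (global)": 0.15,
    "Microsoft (global)": 0.27,
    "Microsoft Arizona": 1.52,
    "Industry average": 1.8,
}

for site, wue in wue_l_per_kwh.items():
    low_ml = ENERGY_KWH[0] * wue * 1000   # liters -> mL
    high_ml = ENERGY_KWH[1] * wue * 1000
    print(f"{site}: {low_ml:.2f}-{high_ml:.2f} mL/image")
```

Even at the industry-average 1.8 L/kWh, on-site cooling for one image stays under 22 mL.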

Nordic data centers achieve near-zero water consumption using free-air cooling and fjord/seawater. Green Mountain's Norwegian facility uses 8°C fjord water from 100 meters deep in a closed loop, consuming effectively zero freshwater.

How the Myth Spread

The "10 gallons per image" claim likely emerged from several documented sources of exaggeration:

  1. Conflating training with inference: GPT-3 training consumed 700,000 liters total — but this one-time cost gets incorrectly distributed across individual queries
  2. Outdated efficiency assumptions: 2023 GPT-3 data applied to 2025 models, despite 10-33x efficiency improvements
  3. Worst-case geographic scenarios: Arizona water usage applied globally, ignoring zero-water Nordic facilities
  4. Including indirect water: Some calculations count evaporation from hydroelectric reservoirs thousands of miles away
  5. Unit conversion errors: Documented cases include claims later corrected after being found off by a factor of 1,000
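A hypothetical illustration of how a single unit slip produces that kind of inflation — the 30 mL figure and the specific mistake here are illustrative, not a documented case:

```python
# Hypothetical illustration of a factor-of-1,000 unit slip:
# misreading a 30 mL estimate as 30 L turns a shot glass of water
# into roughly eight gallons.
ML_PER_GALLON = 3785.41  # milliliters per U.S. gallon

correct_ml = 30      # illustrative per-image estimate, in mL
mistaken_as_l = 30   # same digits, misread as liters

print(f"{correct_ml / ML_PER_GALLON:.4f} gallons")            # ~0.0079 gallons
print(f"{mistaken_as_l * 1000 / ML_PER_GALLON:.1f} gallons")  # ~7.9 gallons
```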

The UC Riverside researchers themselves noted their estimates "vary 30x+" depending on scope boundaries, model version, query complexity, and data center location. Media coverage frequently selected worst-case assumptions while presenting them as typical.

The Bottom Line

AI infrastructure does consume water and energy at scale, and rapid growth creates genuine sustainability challenges. These aggregate concerns deserve attention.

However, the "10 gallons per image" claim is not supported by any peer-reviewed research and contradicts available energy measurements by 100-600x.

The actual water footprint per AI-generated image — approximately 15-60 milliliters — is comparable to a few sips of water. Your lunch today probably used more water than a year of your AI usage.

If we want to have serious conversations about AI's environmental impact, we need serious numbers — not viral headlines.

Sources & Further Reading

  • Li, Yang, Islam, Ren — "Making AI Less Thirsty: Uncovering and Addressing the Secret Water Footprint of AI Models" (UC Riverside, 2023) — arXiv
  • OpenAI Image Token Documentation — platform.openai.com
  • Epoch AI — "How Much Energy Does ChatGPT Use?" — epoch.ai
  • Microsoft — "Sustainable by Design: Next-Generation Datacenters Consume Zero Water for Cooling" (Dec 2024) — microsoft.com
  • Uptime Institute — "Ignore Data Center Water Consumption at Your Own Peril" — uptimeinstitute.com
  • EESI — "Data Centers and Water Consumption" — eesi.org

Tim Bish

Tim runs Understanding Your AI, providing AI education and training to business teams across Michigan's Great Lakes Bay Region. He cuts through the hype to deliver practical, actionable AI insights.