Thirsty Robot

The Hidden Insatiable Thirst of AI: Unveiling the Water Footprint of Intelligence

Can you guess how much water the AI model consumed to generate the featured image of a thirsty robot for this article? Read on for the answer.

In the digital age, artificial intelligence (AI) has become synonymous with innovation and efficiency. Yet, behind the seamless interactions and rapid computations lies an often-overlooked environmental cost: water consumption. As AI systems like ChatGPT become integral to various sectors, understanding their water footprint is crucial.

The Invisible Drain: Water and AI

AI models, particularly large language models (LLMs), require substantial computational power. This power generates heat, necessitating cooling systems in data centers. Many facilities employ evaporative cooling, which consumes significant amounts of water. Once used, this water evaporates and cannot be reclaimed, contributing to the overall water footprint of AI operations.

A 2023 study by researchers at the University of California, Riverside, and the University of Texas at Arlington highlighted this issue, estimating that training GPT-3 in Microsoft’s U.S. data centers could consume approximately 700,000 liters of freshwater. The study also projected that by 2027, global AI water consumption could reach 4.2 to 6.6 billion cubic meters, surpassing the annual water withdrawal of countries like Denmark.

Quantifying the Cost: Insights from Industry Leaders

OpenAI CEO Sam Altman has addressed concerns about AI’s environmental impact. In a June 10, 2025 blog post, The Gentle Singularity, he revealed that an average ChatGPT query uses about 0.000085 gallons of water—roughly one-fifteenth of a teaspoon. While this figure seems minimal, the cumulative effect across billions of queries is substantial.

Altman also noted that each query consumes approximately 0.34 watt-hours of energy, comparable to powering a high-efficiency lightbulb for a few minutes.

“As datacenter production gets automated,” Altman writes, “the cost of intelligence should eventually converge to near the cost of electricity. (People are often curious about how much energy a ChatGPT query uses; the average query uses about 0.34 watt-hours, about what an oven would use in a little over one second, or a high-efficiency lightbulb would use in a couple of minutes. It also uses about 0.000085 gallons of water; roughly one fifteenth of a teaspoon.)”
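Altman's per-query figures sound small, but the cumulative claim is easy to check with back-of-the-envelope arithmetic. The sketch below converts the quoted 0.000085 gallons per query into teaspoons and scales it up; note that the one-billion-queries-per-day volume is an illustrative assumption, not a reported statistic:

```python
# Sanity-checking the per-query figures quoted above: 0.000085 gallons
# of water and 0.34 Wh of energy per ChatGPT query.

GALLON_ML = 3785.41    # millilitres per US gallon
TEASPOON_ML = 4.92892  # millilitres per US teaspoon

water_per_query_ml = 0.000085 * GALLON_ML             # ~0.32 mL
teaspoon_fraction = water_per_query_ml / TEASPOON_ML  # ~1/15 teaspoon

queries_per_day = 1_000_000_000  # assumed volume, for illustration only
daily_water_litres = water_per_query_ml * queries_per_day / 1000
daily_energy_mwh = 0.34 * queries_per_day / 1e6  # Wh -> MWh

print(f"Per query: {water_per_query_ml:.2f} mL "
      f"(~1/{1 / teaspoon_fraction:.0f} teaspoon)")
print(f"At {queries_per_day:,} queries/day: "
      f"{daily_water_litres:,.0f} L of water, {daily_energy_mwh:,.0f} MWh")
```

The arithmetic confirms the "one-fifteenth of a teaspoon" framing, and shows how quickly a third of a millilitre per query compounds into hundreds of thousands of litres a day at scale.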

Despite these assertions, experts argue for greater transparency and detailed data to assess the broader sustainability of AI technologies. Environmental advocates stress the need for comprehensive reporting on water usage, especially in regions facing water scarcity.

The Broader Implications: AI’s Environmental Footprint

The environmental impact of AI extends beyond water consumption. Data centers contribute to carbon emissions, with the energy required for training and operating AI models often sourced from fossil fuels. A 2024 report indicated that training large AI models like GPT-3 released approximately 552 metric tons of carbon dioxide, equivalent to the annual emissions of 123 passenger vehicles.
Moreover, the location of data centers plays a critical role. Facilities situated in drought-prone areas exacerbate local water shortages. For instance, data centers in Virginia’s “Data Center Alley” have seen water usage surge by nearly two-thirds between 2019 and 2023, raising concerns among environmentalists and local communities.

The Risk No One Talks About

In the context of financial services, Sedric's domain, where I lead marketing, this is not an abstract concern. Rather, it's a compliance issue hiding inside an engineering detail.

Consider a global bank adopting large-scale AI for client communication, transaction monitoring, or internal compliance. It might find itself hosting those models in a jurisdiction where water is scarce. What begins as a procurement decision may end as a reputational vulnerability. Stakeholders, regulators, and even ESG auditors may ask: Did you know what your infrastructure was doing to the local watershed?

This is not to suggest catastrophe. But it is to suggest complexity, something compliance teams are not just trained to handle but ethically required to anticipate. Just this year, many parts of Northern and Western Europe have found themselves in a state of drought. While residents are being told to scale back water usage, what about our relentless AI queries? In 2024, Europe's data center industry consumed about 62 million cubic meters of water, the equivalent of roughly 24,000 Olympic swimming pools. Until there is pressure to find more efficient methods, this thirst for water will only grow.

Navigating the Future: Sustainable AI Practices

Addressing AI’s environmental impact requires a multifaceted approach:

  1. Innovative Cooling Solutions: Companies are exploring alternative cooling methods, such as air cooling and liquid immersion cooling, to reduce water usage.
  2. Renewable Energy Integration: Transitioning data centers to renewable energy sources can mitigate carbon emissions associated with AI operations.
  3. Strategic Data Center Placement: Locating data centers in regions with abundant water resources and cooler climates can alleviate pressure on water-stressed areas.
  4. Transparency and Reporting: Implementing standardized reporting on water and energy usage can foster accountability and inform sustainable practices.

Balancing Progress and Sustainability

As AI continues to revolutionize industries, it’s imperative to balance technological advancement with environmental stewardship. Recognizing and addressing the water footprint of AI is a critical step toward sustainable innovation. At Sedric, we advocate for responsible AI development that considers not only performance metrics but also ecological impacts. By fostering transparency and embracing sustainable practices, we can ensure that the growth of AI aligns with the health of our planet.

Epilogue (for my Science fam)

So, just how much water did the AI model consume to generate an image of a thirsty robot? Let’s jump into the numbers.

Assumptions and Methodology

1. Energy Cost of One Image Generation

Based on various benchmarks, generating a single high-resolution DALL·E image (1024×1024 or larger) consumes approximately 2.9 to 5.4 watt-hours (Wh) of electricity, depending on server load and GPU type.

2. Water Consumption per kWh

Water usage depends on two factors:

  1. Evaporative cooling at data centers
  2. Water used in electricity production (if fossil- or thermal-based)

Average combined water use (direct + indirect) in U.S. data centers:
~1.8 liters per kWh

Estimated Water Use

We apply the formula:

Water (liters) = Energy (kWh) × 1.8 L/kWh

For 2.9 Wh:
0.0029 kWh × 1.8 L/kWh ≈ 0.0052 L ≈ 5.2 milliliters

For 5.4 Wh:
0.0054 kWh × 1.8 L/kWh ≈ 0.0097 L ≈ 9.7 milliliters
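In code, the same estimate looks like the sketch below, using the assumed 2.9 to 5.4 Wh energy range per image and the assumed U.S. average water intensity of 1.8 L/kWh:

```python
# Per-image water estimate: energy (Wh) x assumed water intensity
# of 1.8 L per kWh (direct cooling + indirect generation water).

WATER_INTENSITY_L_PER_KWH = 1.8  # assumed U.S. data-center average

def water_ml(energy_wh: float) -> float:
    """Water consumed, in millilitres, for a given energy use in Wh."""
    energy_kwh = energy_wh / 1000
    return energy_kwh * WATER_INTENSITY_L_PER_KWH * 1000  # litres -> mL

for wh in (2.9, 5.4):
    print(f"{wh} Wh -> {water_ml(wh):.1f} mL")
```

Running this reproduces the 5.2 to 9.7 millilitre range used in the result below.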

Result:

Estimated water used to generate the robot image:
~5–10 milliliters of freshwater

That’s roughly one to two teaspoons of water — enough to hydrate a parched AI for a single, satisfying sip.