There is a growing narrative in the media — and increasingly on LinkedIn — that AI-driven data centres are pushing us toward “water bankruptcy.”
In some versions, this is framed as an existential crisis driven almost entirely by AI’s rapid growth.
We wanted to understand whether this claim stands up to scrutiny. Rather than relying on headlines, we spent time looking at the underlying mechanics of data centre water usage — specifically water used for cooling, which is where most of the concern is focused.
Data centre water usage is complex and influenced by multiple factors. For clarity, this analysis looks only at water used for cooling operations. It explicitly excludes:
Those are important topics, but they deserve separate treatment.
Using publicly available information, we looked at ten of the largest US data centres by square footage and estimated power capacity:
This represents approximately 3.8 gigawatts of capacity.
For context, total US data centre capacity is estimated at 40–50 GW and is expected to roughly double by the end of the decade. This sample therefore represents less than 10% of current US capacity.
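The "less than 10%" claim follows directly from these figures. A quick check, using the 3.8 GW sample against both ends of the 40–50 GW range cited above:

```python
# Sanity check of the "<10% of current US capacity" claim.
SAMPLE_GW = 3.8          # estimated capacity of the 10-site sample
US_RANGE_GW = (40, 50)   # estimated total US data centre capacity

for total in US_RANGE_GW:
    share = SAMPLE_GW / total
    print(f"Share of {total} GW fleet: {share:.1%}")
# Share of 40 GW fleet: 9.5%
# Share of 50 GW fleet: 7.6%
```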
Because operators do not publicly disclose detailed operating data, we made several high-level assumptions:
- Traditional systems used ~2.97 litres of water per kWh.
- Modern systems have reduced this to ~0.3 litres per kWh.
These assumptions materially change the outcome and are critical to interpreting the results.
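To make the sensitivity concrete, here is a rough annual estimate for the 3.8 GW sample under each per-kWh figure. Continuous operation at full rated load is a simplifying assumption (real utilisation is lower, and overheads such as PUE are ignored), so treat these as order-of-magnitude numbers:

```python
# Rough annual cooling-water estimate for the 10-site sample.
# Assumes (hypothetically) continuous operation at full rated load.
CAPACITY_GW = 3.8        # estimated combined capacity of the sample
HOURS_PER_YEAR = 8760

annual_kwh = CAPACITY_GW * 1e6 * HOURS_PER_YEAR  # GW -> kW, then kWh/year

for label, litres_per_kwh in [("traditional (~2.97 L/kWh)", 2.97),
                              ("modern (~0.3 L/kWh)", 0.3)]:
    litres = annual_kwh * litres_per_kwh
    print(f"{label}: ~{litres / 1e9:.1f} billion litres/year")
# traditional (~2.97 L/kWh): ~98.9 billion litres/year
# modern (~0.3 L/kWh): ~10.0 billion litres/year
```

The roughly tenfold gap between the two scenarios is exactly why the choice of assumption dominates the conclusion.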
Under these assumptions, the ten data centres analysed require:
Even when extrapolated across the broader US data centre footprint — and then globally — the numbers are undeniably large. The US is estimated to account for around 50% of global data centre capacity.
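Because water use scales linearly with capacity under these assumptions, the extrapolation is straightforward. The sketch below scales the modern 0.3 L/kWh figure from the sample to a 45 GW US midpoint and a ~90 GW global fleet (US share taken as ~50%, per the text); full-load operation remains a simplifying assumption:

```python
# Linear scale-up of the cooling-water estimate, assuming full-load
# operation and the modern ~0.3 L/kWh figure throughout.
MODERN_L_PER_KWH = 0.3
HOURS_PER_YEAR = 8760

def annual_billion_litres(gw):
    """Annual cooling water (billions of litres) for a fleet of `gw` gigawatts."""
    return gw * 1e6 * HOURS_PER_YEAR * MODERN_L_PER_KWH / 1e9

for gw, label in [(3.8, "10-site sample"),
                  (45,  "US fleet (~45 GW midpoint)"),
                  (90,  "global (~2x US)")]:
    print(f"{label}: ~{annual_billion_litres(gw):.0f} billion litres/year")
# 10-site sample: ~10 billion litres/year
# US fleet (~45 GW midpoint): ~118 billion litres/year
# global (~2x US): ~237 billion litres/year
```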
However, two things stand out:
There will, of course, be many older, less efficient facilities still in operation. This analysis should therefore be viewed as a minimum baseline, not a best- or worst-case scenario.
If AI-driven data centre capacity continues to double every few years, water will certainly remain an important constraint.
But it may not be the first one we hit.
Generating renewable energy at the required scale — reliably, affordably, and fast enough — appears to be a significantly larger challenge than cooling water availability alone.
One final point rarely discussed: how AI capacity is actually being used.
Public usage data suggests that a large share of consumer AI activity is focused on chat-based interactions — effectively a more powerful version of traditional search.
If consumers were required to pay an explicit environmental impact cost for AI usage, would demand remain this high? Or is current behaviour driven largely by convenience, with little visibility of the underlying resource trade-offs?
AI-driven data centres do use a lot of water. That is not in dispute.
But the picture is more nuanced than many headlines suggest. Cooling efficiency has improved substantially, water reuse is now standard in modern facilities, and energy generation may prove to be the tighter constraint.
The real risk may not be AI itself — but how thoughtlessly we choose to use it.