Before We Bet the Grid on Mega Data Centres, Let's Ask a Hard Question
The numbers are staggering. Global investment in data centres reached half a trillion dollars in 2024. The four largest hyperscalers collectively committed over $370 billion in 2025 alone. McKinsey estimates that reaching 200 GW of global AI-specific capacity by 2030 will require somewhere between $5 and $7 trillion in cumulative capital. Governments are treating data centre capacity as strategic infrastructure, the way earlier generations treated railways or nuclear arsenals.
This is the consensus position. And it may be right. But consensus positions in technology investment have a poor track record, and this one deserves far more scrutiny than it is currently receiving.
The Trajectory Nobody Is Questioning Loudly Enough
Figure 1. Interior of a Data Centre (CC 2026)
Start with the demand side. The International Energy Agency projects that global data centre electricity consumption will more than double by 2030, reaching around 945 TWh. That is roughly equivalent to Japan’s entire national electricity consumption today. Goldman Sachs forecasts data centres will account for 8% of total US power demand by 2030, up from around 3% in 2022. In the United States alone, data centres are projected to account for nearly half of all electricity demand growth over the next five years.
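To see what that projection implies as a growth rate, a quick back-of-envelope calculation helps. The 2024 baseline of roughly 415 TWh is the figure the IEA report itself uses; treat it as an assumption rather than a precise input.

```python
# Implied compound annual growth rate of the IEA projection.
# Assumes a ~415 TWh baseline for 2024 (the IEA's own figure, taken
# here as an assumption) rising to ~945 TWh by 2030.
baseline_twh = 415      # 2024 global data centre consumption (assumed)
projected_twh = 945     # 2030 IEA projection
years = 6

cagr = (projected_twh / baseline_twh) ** (1 / years) - 1
print(f"Implied growth rate: {cagr:.1%} per year")  # ~14.7% per year
```

That is nearly 15% compounded annually, sustained for six consecutive years.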
These projections are alarming not because growth is inherently bad, but because the infrastructure required to underwrite it is being built on an assumption that has never held in the history of resource-intensive technology: that gains in computing efficiency will reduce, rather than amplify, total consumption.
That assumption is wrong.
The Paradox at the Centre of the Efficiency Argument
The dominant defence of mega data centre investment runs like this: yes, current AI models are energy-hungry, but algorithmic breakthroughs are coming. DeepSeek’s training costs were a fraction of those of its predecessors. Model distillation and quantisation can dramatically reduce inference costs. Smarter architectures will do more with less. So, we should build now and trust efficiency to mop up the environmental bill later.
This argument is not simply optimistic. It is structurally flawed.
In 1865, the economist William Stanley Jevons observed that improvements in steam engine efficiency had not reduced Britain’s coal consumption. They had tripled it. As coal became cheaper to use, it became accessible to new industries and wider applications. Total consumption exploded. The same logic applies with ruthless consistency across resource economics.
The digital version of this paradox is now documented in peer-reviewed research. A 2025 study published in Nature Cities found empirical evidence that algorithmic efficiency gains in metropolitan data centres can enlarge, rather than shrink, the energy footprint of AI. When Microsoft CEO Satya Nadella responded to DeepSeek’s efficiency breakthrough with the comment, “Jevons paradox strikes again,” he was not issuing a warning. He was cheering.
And that is the problem. Cheaper computing does not mean less demand for computing. It means computing floods into use cases previously too expensive to contemplate. More efficient AI democratises access to AI, which expands AI adoption, which drives more data centre construction. The record of the past decade bears this out: data centre electricity use has grown at roughly 12% annually despite continuous improvements in hardware efficiency.
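The dynamic can be sketched in a few lines. The model below is a toy illustration, not a calibrated forecast: the constant price-elasticity form and the elasticity values are hypothetical assumptions, chosen only to show the mechanism by which cheaper compute can raise total energy use.

```python
def total_energy(efficiency_gain: float, elasticity: float) -> float:
    """Toy Jevons model (hypothetical parameters, illustration only).

    An efficiency gain cuts the energy, and hence the cost, of a unit
    of compute; cheaper compute expands demand via a constant price
    elasticity. Baseline demand and energy-per-unit are normalised to 1.
    """
    energy_per_unit = 1.0 / efficiency_gain
    demand = efficiency_gain ** elasticity  # demand response to cheaper compute
    return demand * energy_per_unit

print(total_energy(10, elasticity=1.3))  # ~2.0: a 10x efficiency gain doubles total energy
print(total_energy(10, elasticity=0.5))  # ~0.32: only inelastic demand lets efficiency pay off
print(f"{1.12 ** 10:.1f}x")              # ~3.1x: 12% annual growth, compounded over a decade
```

Whether real-world compute demand is elastic enough to sit in the first regime is an empirical question, but the past decade’s 12% annual growth, roughly a tripling, suggests it has been so far.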
If efficiency does not contain the footprint, something else must. And right now, nothing else is.
Water: The Hidden Catastrophe
Energy is the headline risk. Water is the crisis that rarely makes the headlines.
An average large data centre consumes around 450,000 gallons of water per day for cooling. A one-megawatt facility can consume up to 25.5 million litres per year. By 2028, AI data centres globally could use as much water as 18.5 million households, according to a 2026 Food and Water Watch analysis. The UN Environment Programme projects global AI water demand will reach 4.2 to 6.6 billion cubic metres by 2027.
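Those figures arrive in mixed units, which makes them hard to compare. The conversion below is plain arithmetic on the numbers quoted above; the only assumption is the standard US-gallon-to-litre factor.

```python
LITRES_PER_US_GALLON = 3.785

# Average large facility: 450,000 US gallons per day (figure quoted above).
daily_litres = 450_000 * LITRES_PER_US_GALLON   # ~1.7 million litres/day
annual_litres = daily_litres * 365              # ~622 million litres/year
print(f"{annual_litres / 1e6:.0f} million litres per year")

# For scale: roughly 24x the ~25.5 million litres/year quoted for a 1 MW facility.
print(f"{annual_litres / 25.5e6:.0f}x")
```

Well over half a billion litres a year, for a single large facility.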
Figure 2. Many data centres are being built in water-constrained areas
This is not an abstraction. Around two-thirds of US data centres built since 2022 are located in water-stressed regions. In Australia, a country not unfamiliar with drought, Sydney Water has warned that data centres could consume one-quarter of the city’s usable water supply by 2035 if current development plans proceed. Melbourne’s existing and planned data centres are projected to consume water equivalent to the annual needs of 330,000 residents. Residents in Newton County, Georgia, reportedly experienced dry taps as nearby data centres came online.
The social licence problem is acute. A community cannot drink server cooling. When data centres and households compete for a shrinking water supply, the community’s tolerance for the technology will erode regardless of what the economic models say.
Who Actually Pays?
The economics of mega data centres look attractive on a spreadsheet. The technology sector generates high gross value added per terawatt-hour of energy, GPU compute can command a substantial markup over its operating costs, and the industry generates jobs in construction and logistics.
But the costs are substantially socialised, while the profits are substantially private.
US residential electricity rates rose 31% between 2020 and 2025, a period that directly overlaps with the data centre construction boom. In Virginia, a major hub, data centre energy demand rose 30% from 2024 to 2025 alone. Research from Carnegie Mellon University estimates that data centres and cryptocurrency mining combined could increase the average US electricity bill by 8% by 2030, potentially exceeding 25% in the highest-demand markets. In Mesa, Arizona, Google reportedly paid $6.08 per 1,000 gallons of water, while residents paid $10.80.
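Two of those numbers repay a second look. The arithmetic below simply annualises the 31% cumulative rise and compares the two reported Mesa water rates; no new data is introduced.

```python
# Annualised rate behind a 31% cumulative rise over 2020-2025 (five years).
annual_rate = 1.31 ** (1 / 5) - 1
print(f"{annual_rate:.1%} per year")  # ~5.5% per year, every year

# Mesa, Arizona: reported price per 1,000 gallons of water.
google_rate, resident_rate = 6.08, 10.80
print(f"{resident_rate / google_rate:.1f}x")  # residents pay ~1.8x Google's rate
```

Residents, in other words, reportedly paid nearly 80% more per gallon than the hyperscaler sited in their city.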
Data centres require continuous, uninterruptible power and resist being curtailed during peak demand. In a grid emergency, this inflexibility is not a technical detail. It is a political problem that concentrates risk on the most vulnerable consumers.
The Assumption We Should Be Stress-Testing
None of this means data centres are not essential. They are. Centralised, high-density compute is the only architecture currently capable of training frontier AI models at scale. Healthcare AI, climate modelling, and genomics research all depend on stable compute backbones. Nations without domestic data centre capacity cede strategic autonomy in the AI economy.
But the current investment trajectory is not premised on need. It is premised on a particular theory of AI’s future: that compute scaling will remain the primary driver of capability for the foreseeable future, and that larger models will always outperform smaller, more efficient ones.
That theory is being actively challenged. Alternative architectures, including sparse mixture-of-experts models, neuromorphic chips, and edge-optimised inference systems, are maturing. Intel’s Loihi 2 neuromorphic chip reportedly offers roughly 1,000 times greater energy efficiency than conventional GPUs for AI inference. Distributed inference across edge devices, hospitals, and embedded systems is increasingly viable for workloads that do not require centralised training.
The most likely future is not hyperscale or nothing. It is hybrid: centralised facilities for training, distributed edge infrastructure for inference. The question is whether the $5 to $7 trillion being allocated to hyperscale construction is calibrated to that hybrid future, or to a future that may not arrive.
What Caution Actually Looks Like
None of this is an argument against data centres. It is an argument against uncritical overcommitment to a single infrastructure model at a scale and pace that outstrips our capacity to manage the consequences.
Caution means insisting on mandatory renewable energy sourcing before approvals, not voluntary pledges. It means binding water-use assessments as a prerequisite for siting, not an afterthought negotiated after construction begins. It means designing regulatory frameworks that prevent energy costs from being socialised onto residential consumers who have no say in data centre siting decisions. It means funding serious research into edge and neuromorphic architectures that could change the structural demand picture.
And it means being honest about the Jevons Paradox. Efficiency gains will not save us from the consequences of building more. They will enable us to build more and consume more. If the aggregate environmental and social footprint of that expansion is acceptable, make the case honestly. Do not dress up a resource extraction bet as a sustainability strategy.
The AI economy needs compute infrastructure. It does not need $7 trillion worth of infrastructure built in five years on the assumption that nothing about the technology will change, that efficiency will contain demand, and that communities will bear costs indefinitely without pushing back.
Caution is not the enemy of progress. Unexamined momentum usually is.
References
1 International Energy Agency (2025). Energy and AI. IEA, Paris. https://www.iea.org/reports/energy-and-ai
2 Goldman Sachs Research (2024). AI is poised to drive 160% increase in data center power demand. https://www.goldmansachs.com/insights/articles/AI-poised-to-drive-160-increase-in-power-demand
3 Goldman Sachs Research (2025). AI to drive 165% increase in data center power demand by 2030. https://www.goldmansachs.com/insights/articles/ai-to-drive-165-increase-in-data-center-power-demand-by-2030
4 McKinsey & Company (2025). The cost of compute: A $7 trillion race to scale data centers. https://www.mckinsey.com/industries/technology-media-and-telecommunications/our-insights/the-cost-of-compute-a-7-trillion-dollar-race-to-scale-data-centers
5 Wu, J. (2025). Digital Jevons paradox in urban data center energy systems. Nature Cities, 2, 677. https://doi.org/10.1038/s44284-025-00289-9
6 Wang, W. (2025). The Jevons Paradox: Why efficiency alone won’t solve our data center carbon challenge. ACM SIGARCH Computer Architecture News. https://www.sigarch.org/the-jevons-paradox-why-efficiency-alone-wont-solve-our-data-center-carbon-challenge/
7 Food and Water Watch (2026). The urgent case against data centers. https://www.foodandwaterwatch.org/wp-content/uploads/2026/03/RPT2_2602_DataCenterMoratorium.pdf
8 Pew Research Center (2025). US data centers’ energy use amid the artificial intelligence boom. https://www.pewresearch.org/short-reads/2025/10/24/what-we-know-about-energy-use-at-us-data-centers-amid-the-ai-boom/
9 Carbon Brief (2025). AI: Five charts that put data-centre energy use and emissions into context. https://www.carbonbrief.org/ai-five-charts-that-put-data-centre-energy-use-and-emissions-into-context/
10 Australian Broadcasting Corporation (2025, December 10). Demand for water cannot be an ‘afterthought’ in AI push. https://www.abc.net.au/news/2025-12-10/demand-for-data-centre-water-in-ai-push/106102208
11 BLS Strategies (2026). Community opposition is reshaping data center strategy. https://blsstrategies.com/insights-press/community-opposition-is-reshaping-data-center-strategy
12 IEA (2025). World Energy Outlook 2025: Executive Summary. https://www.iea.org/reports/world-energy-outlook-2025/executive-summary