May 10

The Internet's Dirty Secret | How Data Centres Became the Defining Environmental Crisis of the AI Age

Data centre energy consumption is accelerating faster than renewables can keep up. From the 415 terawatt hours already being consumed annually, to water shortages in small American towns, to Big Tech's nuclear pivot, this piece maps the full environmental cost of the AI infrastructure boom and asks who is really paying for it.

I want to tell you about a building you've probably never thought about, but have almost certainly used today.

It has no windows. No signage worth speaking of. It hums at a frequency you feel more than hear. And right now, as you read this, it is drawing enough electricity to power tens of thousands of homes. Not to heat them, not to light them, but to process the requests of people asking an AI chatbot what to make for dinner, or running a search query, or streaming a film they'll turn off after twenty minutes.

That building is a data centre. There are thousands of them. And they are growing faster than almost any infrastructure in human history.

The conversation around data centres and energy tends to get framed as a technology story, a story of innovation, of progress, of the inevitable march of the digital economy. I think that framing lets us off the hook too easily. Because what is actually happening, when you sit with the numbers, is something more uncomfortable. We are building a new industrial sector at breakneck pace, and we are only beginning to grapple with what that means for the grid, for water supplies, for climate targets, and for the communities that get chosen as hosts.

The Scale Is Hard to Comprehend

Let me try anyway.

Data centres currently consume around 415 terawatt hours of electricity every year, roughly 1.5% of all global electricity demand. That sounds modest, until you consider that this figure has grown at 12% per year for the last five years, and that the International Energy Agency now projects electricity consumption from data centres will double by 2030.

Put another way: by the end of this decade, the world's data centres will use as much electricity as India does today. India, a country of 1.4 billion people.
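The arithmetic behind that projection is worth checking. A quick sketch, using only the figures quoted above (415 TWh today, 12% annual growth), shows the recent growth rate and the doubling projection are consistent with each other:

```python
import math

# Compound-growth check on the figures quoted above:
# 415 TWh today, growing at 12% per year.
base_twh = 415.0
growth = 0.12

# Project five years out (roughly 2025 -> 2030).
projected = base_twh * (1 + growth) ** 5
print(f"5-year projection: {projected:.0f} TWh")

# Doubling time at 12%/yr growth.
doubling_years = math.log(2) / math.log(1 + growth)
print(f"Doubling time: {doubling_years:.1f} years")
```

At 12% a year the total roughly reaches 730 TWh in five years and doubles in about six, which lines up with the IEA's 2030 projection, assuming growth continues at the recent rate.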

In the United States, the picture is even more concentrated. American data centres already account for around 4.4% of national electricity consumption. The Department of Energy projects that figure could reach 7 to 12% by 2028. In Loudoun County, Virginia, the world's single largest data centre market and home to a corridor of server farms so dense it has been nicknamed Data Centre Alley, these facilities already account for 24% of total local power consumption, surpassing even residential use.

And then there is Ireland. In a country that spent decades marketing itself as Europe's tech hub, data centres now consume around 21% of national electricity. The IEA has estimated that figure could reach 32% by 2026. New facilities are connecting directly to the gas grid. The government has, so far, declined to impose meaningful limits on new builds.

These are not edge cases. These are the numbers.

What Changed: The AI Inflection

For most of the internet's existence, the growth in data centre energy demand was real but manageable. Efficiency gains from better chips, smarter cooling, and more consolidated infrastructure roughly kept pace with the explosion in data traffic. Between 2010 and 2018, global internet traffic increased roughly tenfold while data centre energy use grew by only about 6%. The industry quietly congratulated itself on this decoupling.

Then came the large language models.

A standard Google search consumes around 0.3 watt-hours of electricity. A ChatGPT query consumes approximately 2.9 watt-hours. That is roughly ten times the energy for a single interaction. Now multiply that by the hundreds of millions of AI queries being processed every day, for writing assistance, image generation, code completion, customer service, legal research, medical diagnosis support, and the scale of the shift begins to come into focus.

The IEA's most recent analysis found that electricity demand from data centres soared by 17% in 2025 alone, with AI-focused facilities growing even faster. The five largest technology companies, Amazon, Microsoft, Google, Meta, and Apple, collectively spent over $400 billion on capital expenditure in 2025, a figure projected to increase by a further 75% in 2026.

I find it useful to think of this not as a continuation of the existing trend but as a phase transition. The data centre industry that existed before generative AI was one thing. What is being built now is something qualitatively different: a new category of heavy industrial infrastructure, consuming resources at a scale that belongs in the same conversation as steelmaking or cement production.

The Water Problem Nobody Is Talking About Enough

Energy gets most of the headlines. Water deserves more.

Cooling a data centre requires enormous quantities of it. The servers generate heat and that heat has to go somewhere. The most efficient way to remove it at scale is evaporative cooling, essentially allowing water to evaporate and carry the heat away with it. This works well thermodynamically. It is less elegant environmentally.
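The physics can be sketched directly. Evaporating water absorbs roughly 2.26 megajoules per kilogram (its latent heat of vaporisation), so the minimum water needed to reject a given heat load is a one-line calculation; real cooling towers use more, since some water is discharged as blowdown rather than evaporated:

```python
# Minimum evaporative water use per MWh of heat rejected.
# Idealised lower bound: assumes all cooling comes from evaporation.
LATENT_HEAT_MJ_PER_KG = 2.26   # latent heat of vaporisation of water
MJ_PER_MWH = 3600.0            # 1 MWh = 3,600 MJ

kg_per_mwh = MJ_PER_MWH / LATENT_HEAT_MJ_PER_KG
litres_per_mwh = kg_per_mwh              # ~1 kg of water per litre
gallons_per_mwh = litres_per_mwh / 3.785 # litres -> US gallons

print(f"{litres_per_mwh:,.0f} litres (~{gallons_per_mwh:,.0f} US gallons) per MWh")
```

That works out to roughly 1,600 litres per megawatt-hour of heat as a theoretical floor. A hypothetical 100 MW facility running flat out would therefore evaporate on the order of a million US gallons a day before accounting for blowdown, which is broadly consistent with the household-equivalent figures reported elsewhere in this piece.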

US data centres directly consumed approximately 17 billion gallons of water for cooling in 2023. According to Lawrence Berkeley National Laboratory, that figure could double or even quadruple by 2028 as AI-driven expansion continues.

The national average masks a more troubling local reality. In The Dalles, Oregon, Google's water use grew by 316% over a period when the town's own population grew by just 12%. Two-thirds of new hyperscale campuses built since 2022 are located in high water-stress counties. One in five servers in the United States sits in a water-stressed watershed.

These are not abstractions. In drought-prone regions already managing competing demands from agriculture, municipalities, and industry, the arrival of a hyperscale data centre that consumes millions of gallons per day represents a genuine and immediate pressure on shared resources.

Some companies are taking this seriously. Microsoft has cut potable water use by 97% at its Quincy, Washington facility and is investing in zero-water-evaporation designs. Google replenished 4.5 billion gallons of water in 2024 through conservation and watershed projects. These are meaningful steps. They are also the exception, not the rule.

The Renewable Energy Gap

The technology industry has, to its credit, been among the most aggressive corporate purchasers of renewable energy. Amazon, Microsoft, Meta, and Google have collectively contracted nearly 50 gigawatts of renewable capacity through power purchase agreements, equivalent to the entire generation capacity of Sweden.

And yet.

Goldman Sachs estimates that future electricity demand from data centres through 2030 will be met 60% by natural gas and only 40% by renewables. The reason is a mismatch in timing and geography. Renewable projects take years to permit and build. Grid connection queues are overloaded. Data centre developers, under pressure to get capacity online quickly for AI workloads that are monetising now, are making pragmatic choices: building diesel generators as backup, connecting to gas-fired peakers, or in some cases advancing projects with onsite natural gas generation that bypasses grid constraints entirely.

In Germany, Microsoft's expansion near Hambach sits adjacent to one of Europe's most contested coal mines. In the United States, data centre operators are advancing a significant number of projects with dedicated onsite gas generation, largely to circumvent the slow pace of grid connections.

There is also the question of what renewable procurement actually means in practice. When a hyperscaler announces it has matched 100% of its electricity consumption with renewables, this typically refers to the purchase of renewable energy certificates, accounting instruments that confirm that somewhere on the grid, the equivalent number of megawatt-hours was generated from clean sources. This is not the same as running on clean power in real time. The grid does not work that way. At 2am on a January night when the wind is not blowing, a data centre in Virginia draws power from whatever is generating, and in many parts of the United States, that still means coal or gas.

The more rigorous standard, known as 24/7 carbon-free energy, matches consumption to clean generation in the same location hour by hour. It is championed by Google and a handful of others, but it remains far from universal.
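The gap between annual matching and hourly matching is easy to make concrete. In the toy sketch below, with purely illustrative hourly profiles, a facility is more than 100% matched on an annual certificate basis while clean generation actually covers less than half its consumption hour by hour:

```python
# Toy comparison: annual REC matching vs hourly 24/7 CFE.
# Profiles are illustrative, not real grid data.

# 24 hourly values: flat data-centre load, solar-shaped clean supply.
load = [100.0] * 24                      # MWh consumed each hour
clean = [0] * 6 + [60, 135, 210, 270, 300, 315,
                   315, 300, 270, 210, 135, 60] + [0] * 6

# Certificate accounting: total clean MWh vs total consumption.
annual_match = sum(clean) / sum(load)

# Hourly CFE: clean energy only counts in the hour it is generated.
hourly_clean = sum(min(l, c) for l, c in zip(load, clean))
cfe_score = hourly_clean / sum(load)

print(f"Annual match: {annual_match:.1%}")
print(f"Hourly CFE:   {cfe_score:.1%}")
```

In this sketch the facility can claim over 100% renewable matching on paper, yet at night it runs entirely on whatever the grid is burning. That is precisely the distinction between certificate accounting and 24/7 carbon-free operation.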

Nuclear's Second Act

One of the more striking developments of the last two years has been the speed with which the technology industry has turned to nuclear power as a potential solution.

The logic is compelling. Nuclear plants generate large amounts of reliable, zero-carbon electricity regardless of weather conditions. They can be sited close to data centre campuses. And for an industry that needs guaranteed power delivery at scale, the firmness of nuclear baseload is worth paying a premium for.

The pipeline of agreements between data centre operators and small modular reactor projects has grown from 25 gigawatts at the end of 2024 to 45 gigawatts today, according to the IEA. Microsoft has signed agreements with Constellation Energy to restart the Three Mile Island plant. Amazon has backed SMR development. Google has contracted with Kairos Power for a fleet of small reactors.

Whether this represents a genuine transformation of the energy landscape or an elaborate exercise in option-buying will depend on execution. SMRs remain largely unproven at commercial scale. Regulatory timelines are long. Construction costs have historically overrun. The technology industry's appetite for nuclear should be welcomed, but with clear eyes about the gap between announced agreements and operating megawatts.

The Community Dimension

It is easy to discuss data centres as an abstract infrastructure question, gigawatts and terawatt-hours and carbon intensity metrics. It is worth occasionally remembering that these facilities are built in places where people live.

Research from Carnegie Mellon University estimates that data centres and cryptocurrency mining could increase average US electricity bills by 8% by 2030, potentially exceeding 25% in high-demand markets like Northern Virginia. The mechanism is straightforward: large concentrated loads drive the need for new generation capacity and grid investment, and those costs are ultimately socialised across ratepayers.

In communities that host data centres, the picture is genuinely mixed. The facilities bring construction jobs, property tax revenue, and in some cases long-term employment. They also bring noise, water use, and truck traffic during construction, along with growing concerns about whether the economic benefit is proportionate to the resource extraction. A data centre that employs 30 people full-time while consuming the electricity of 100,000 homes raises a legitimate question about the allocation of public infrastructure.

This is not an argument against data centres. It is an argument for taking the community calculus seriously, rather than treating it as a planning formality.

Is AI Part of the Solution?

It would be intellectually dishonest to write about data centres and energy without acknowledging the other side of this ledger.

The IEA has noted, carefully, that AI is becoming not just an energy consumer but an energy tool, being deployed to optimise grid management, accelerate materials discovery for better batteries, improve the accuracy of renewable energy forecasting, and reduce waste in industrial processes. Google's DeepMind has used AI to reduce cooling energy consumption at Google's own data centres by around 40%.

There is also the efficiency trajectory to consider. Power consumption per AI task is declining rapidly, at a rate the IEA has described as unprecedented in energy history. The models are getting more capable per unit of compute. Hardware efficiency is improving with each generation of chip. If these trends continue, the energy intensity of AI workloads may look very different in 2030 than it does today.

The problem is the rebound effect. As AI becomes cheaper and more efficient, more people use it, for more things, more often. The efficiency gains are real, and they are being outpaced by volume growth. This is a pattern familiar from the history of computing and from the history of energy more broadly. Efficiency rarely shrinks total consumption, because lower costs unlock new demand.
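The dynamic is simple to model. Assume, purely for illustration, that energy per task falls 30% a year while task volume grows 80% a year; total demand still climbs steeply:

```python
# Rebound-effect sketch: efficiency gains vs volume growth.
# Both rates are illustrative assumptions, not measured figures.
energy_per_task = 1.0     # arbitrary units
tasks = 1.0
EFFICIENCY_GAIN = 0.30    # energy per task falls 30%/yr
VOLUME_GROWTH = 0.80      # task volume grows 80%/yr

for year in range(1, 6):
    energy_per_task *= (1 - EFFICIENCY_GAIN)
    tasks *= (1 + VOLUME_GROWTH)
    total = energy_per_task * tasks
    print(f"Year {year}: total demand x{total:.2f}")
```

Each year total demand multiplies by 0.7 × 1.8 = 1.26, so even a 30% annual efficiency gain is swamped by 80% volume growth and demand still triples in five years. Only when efficiency gains outpace volume growth does total consumption actually fall.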

What Good Would Actually Look Like

I am not interested in the version of this story that ends with a call to stop building data centres. That is not a serious proposal. The digital economy is the economy, and the AI transition is happening regardless of how uncomfortable the energy implications are.

What I am interested in is the version of this story where the people building, financing, regulating, and hosting this infrastructure make choices with clear eyes rather than convenient ones.

That means genuine 24/7 clean energy commitments, not annual certificate matching. It means serious water management, particularly in water-stressed regions. It means transparent reporting, given that there is currently no comprehensive global dataset on data centre electricity consumption because few governments require it. It means honest conversations with host communities about the full cost-benefit picture. It means grid investment that keeps pace with demand, rather than developers defaulting to onsite gas because the public grid moves too slowly.

It also means, frankly, that some of us start asking harder questions about what we use AI for. Not every query needs to be answered by a large language model. Not every image needs to be generated. Not every document needs to be summarised by a neural network. The individual contribution of any single interaction is vanishingly small, but the aggregate shape of our habits is part of what builds the demand curve that data centres are being constructed to serve.

The infrastructure being built right now will be operating for twenty to thirty years. The decisions being made today, about where to site these facilities, how to power them, and how to account for their environmental impact, will shape the energy landscape for decades. That is a serious responsibility, and it deserves a more serious conversation than it is currently getting.

The building hums. The requests keep coming. Somewhere, a server processes a query, consumes its watt-hours, and moves on to the next one.

The question is what we decide to do while it's doing that.

Data sourced from the International Energy Agency (Energy and AI, 2025), Key Questions on Energy and AI (2026), Lawrence Berkeley National Laboratory, the US Department of Energy, Carnegie Mellon University, and Goldman Sachs research.

FAQs

1. How much electricity do data centres use globally?


Data centres currently consume around 415 terawatt hours of electricity per year, roughly 1.5% of all global electricity demand. That figure has grown at 12% per year over the last five years and is projected by the International Energy Agency to double by 2030.

2. Why has AI made data centre energy consumption so much worse?


A standard Google search uses around 0.3 watt-hours of electricity. A ChatGPT query uses approximately 2.9 watt-hours, roughly ten times as much. Multiply that across hundreds of millions of daily AI interactions and the cumulative demand becomes enormous. Electricity demand from data centres soared by 17% in 2025 alone, with AI-focused facilities growing even faster.

3. How much water do data centres consume?


US data centres consumed approximately 17 billion gallons of water for cooling in 2023, according to Lawrence Berkeley National Laboratory. That figure could double or quadruple by 2028. A Congressional Research Service report estimates a 100-megawatt data centre consumes roughly the same water as 2,600 households.

4. Are data centres powered by renewable energy?


Major tech companies have contracted nearly 50 gigawatts of renewable capacity, but Goldman Sachs estimates that data centre demand through 2030 will still be met 60% by natural gas and only 40% by renewables. The gap comes down to timing — renewable projects take years to build while AI data centre demand is scaling now.

5. What does "100% renewable" actually mean for a data centre?


When a hyperscaler claims to match 100% of its electricity with renewables, it typically refers to the purchase of renewable energy certificates, not real-time clean power. The more rigorous standard, known as 24/7 carbon-free energy, matches consumption to clean generation hour by hour in the same location. Only a handful of companies currently meet it.

6. Why are tech companies investing in nuclear power?


Nuclear provides reliable, zero-carbon baseload electricity regardless of weather. The pipeline of agreements between data centre operators and small modular reactor projects has grown from 25 gigawatts to 45 gigawatts in under a year, according to the IEA. Microsoft, Google, and Amazon have all signed deals with nuclear developers.

7. How do data centres affect local electricity bills?


Research from Carnegie Mellon University estimates that data centres and cryptocurrency mining could increase average US electricity bills by 8% by 2030, with the impact potentially exceeding 25% in high-density markets like Northern Virginia. Large concentrated loads require new generation and grid investment, and those costs are typically passed to all ratepayers.

8. Which country is most affected by data centre energy demand?


Ireland is the most acute case. Data centres already consume around 21% of national electricity and the IEA estimates that figure could reach 32% by 2026. In Dublin, data centres account for nearly 80% of the city's electricity use. In the US, Loudoun County, Virginia, sees data centres consume 24% of local power, more than residential use.

9. Can AI help solve the energy problem it is creating?


In some ways, yes. AI is being used to optimise grid management, improve renewable energy forecasting, and reduce industrial waste. Google's DeepMind reduced cooling energy at its own data centres by around 40% using AI. But the rebound effect is real — as AI becomes more efficient and cheaper, usage grows faster than efficiency gains, keeping total energy demand on an upward trajectory.

10. Is data centre energy consumption being regulated?


Regulation is increasing but still patchy. The EU now requires data centres above 500kW to report energy consumption, water use, and waste heat. The IEA has noted there is still no comprehensive global dataset on data centre electricity consumption because few governments mandate reporting. The US has begun tracking growth more closely but has not yet imposed binding limits.


© 2026 Greener Wisdom. All rights reserved.