One of the strange things about artificial intelligence is how physically invisible it feels.
You type a question into a box, wait a second or two, and a response appears. The interaction feels almost weightless, more like pulling information from thin air than using a piece of industrial infrastructure. There is no visible machinery, no obvious sense that anything substantial has just happened.
But somewhere, often hundreds or thousands of miles away, a cluster of servers inside a warehouse-sized data centre has consumed electricity to process that request. Chips have performed billions of calculations. Cooling systems have removed the heat generated by those computations. Power systems have balanced fluctuating electrical loads across facilities that increasingly consume as much energy as small towns.
The scale of this infrastructure is difficult to intuit because the individual interaction feels trivial. Asking ChatGPT to summarise a document or help draft an email does not feel industrial in the way a steel mill or an airport does. Yet collectively, AI systems are becoming one of the fastest-growing sources of electricity demand in the digital economy.
That raises an increasingly important question: how much electricity does ChatGPT actually use?
How Much Energy Does a ChatGPT Query Use?
The honest answer is that nobody outside the major technology companies knows precisely. OpenAI does not publicly disclose detailed electricity consumption figures for every model, and most competitors are similarly opaque. What researchers and infrastructure analysts have produced instead are estimates based on hardware specifications, data centre utilisation rates, and observed deployment patterns.
The figure most commonly cited is that a ChatGPT query consumes roughly ten times more electricity than a traditional Google search.
According to analysis referenced by the International Energy Agency, a standard search engine query uses approximately 0.3 watt-hours of electricity, while a large language model request may require closer to 2.9 watt-hours depending on the complexity of the task, the model involved, and the hardware processing it. (iea.org)
In isolation, that amount of electricity is relatively small. It is less than many laptops draw in an hour, and enough to run a 10-watt LED bulb for only around 17 minutes.
The issue is not the individual request. The issue is scale.
ChatGPT reportedly handles hundreds of millions of prompts every day. Google is integrating generative AI into products used by billions of people, while Microsoft, Meta, and Amazon are rapidly expanding AI functionality across their ecosystems. Tiny energy demands multiplied across global usage begin to add up remarkably quickly.
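The arithmetic of scale can be sketched in a few lines. The per-query figures are the IEA-referenced estimates cited above; the daily prompt volume of 300 million is a purely illustrative assumption, since OpenAI does not publish exact numbers.

```python
# Back-of-envelope scale estimate for LLM query energy.
# Per-query figures: IEA-referenced estimates quoted in the text.
# Daily query volume: an illustrative assumption, not a disclosed figure.
WH_PER_LLM_QUERY = 2.9    # watt-hours per large language model request
WH_PER_SEARCH = 0.3       # watt-hours per traditional search query
QUERIES_PER_DAY = 300e6   # assumed daily prompt volume (illustrative)

daily_mwh = WH_PER_LLM_QUERY * QUERIES_PER_DAY / 1e6  # Wh -> MWh
annual_gwh = daily_mwh * 365 / 1e3                    # MWh -> GWh

print(f"Daily:  {daily_mwh:,.0f} MWh")
print(f"Annual: {annual_gwh:,.0f} GWh")
print(f"Ratio vs search: {WH_PER_LLM_QUERY / WH_PER_SEARCH:.1f}x")
```

Under these assumptions, a single popular service lands in the hundreds of gigawatt-hours per year, which is why the per-query triviality is so misleading.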
Why AI Uses More Electricity Than Google Search
Large language models consume more energy than traditional internet services because they are doing fundamentally different kinds of work.
A conventional search engine primarily retrieves and ranks information that already exists. Large language models generate responses in real time by performing enormous numbers of calculations that predict the most statistically likely next word in a sequence, repeating the process continuously until the response is complete.
Those calculations rely heavily on GPUs, or graphics processing units, which were originally designed for highly parallel workloads in gaming and scientific computing. Modern AI systems may use tens of thousands of these chips simultaneously inside hyperscale data centres.
NVIDIA’s H100 accelerator, one of the chips most closely associated with the current AI boom, can consume around 700 watts under load. The newer generation of AI hardware is expected to push power requirements even higher as companies race to train larger and more capable systems.
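To get a feel for what that per-chip figure implies at cluster scale, here is a rough sketch. The GPU count and the PUE (power usage effectiveness) overhead factor for cooling and facility systems are illustrative assumptions, not vendor or operator figures.

```python
# Rough cluster power sketch: N accelerators at their approximate board
# power, scaled by a PUE factor covering cooling and facility overhead.
# The GPU count and PUE value are illustrative assumptions.
GPU_WATTS = 700     # approximate H100 power draw under load (from the text)
NUM_GPUS = 10_000   # assumed cluster size (illustrative)
PUE = 1.2           # assumed power usage effectiveness of the facility

it_load_mw = GPU_WATTS * NUM_GPUS / 1e6  # compute load only, in MW
facility_mw = it_load_mw * PUE           # including overhead, in MW

print(f"IT load:       {it_load_mw:.1f} MW")
print(f"Facility draw: {facility_mw:.1f} MW")
```

On those assumptions, a single ten-thousand-GPU cluster draws several megawatts continuously, comparable to a small industrial plant.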
Training frontier AI models requires extraordinary bursts of computation over periods lasting weeks or months. But inference, the process of generating responses after a model has already been trained, may ultimately become the larger long-term energy challenge because it occurs continuously, at global scale, every second of every day.
The Growing Energy Demand of AI Data Centres
For most of the internet era, the technology industry managed to keep data centre electricity demand relatively stable despite enormous growth in internet traffic. Improvements in chip efficiency, server utilisation, and cooling systems allowed computing capacity to expand far faster than overall power consumption.
Generative AI appears to be changing that equation.
The International Energy Agency estimates that data centres currently consume around 415 terawatt-hours of electricity annually, representing roughly 1.5% of total global electricity demand. The agency projects that this figure could approach 945 terawatt-hours by 2030 as AI infrastructure continues expanding throughout the decade. That would place global data centre electricity consumption at roughly the same level as Japan’s annual power usage today. (iea.org)
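The IEA's two figures imply a striking growth rate. As a quick check, the compound annual growth implied by moving from 415 TWh to 945 TWh can be computed directly; assigning the baseline to 2024 and the projection to 2030 is an assumption made here for illustration.

```python
# Implied compound annual growth rate (CAGR) from the IEA figures cited
# above. The year assignments are assumptions for illustration only.
baseline_twh, projected_twh = 415, 945
years = 2030 - 2024

cagr = (projected_twh / baseline_twh) ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.1%}")
```

That works out to roughly 15% growth per year, sustained for the rest of the decade.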
In the United States, electricity demand from data centres is becoming increasingly concentrated in regions like Northern Virginia, Texas, and parts of the Midwest where hyperscale campuses are being developed at unprecedented speed. Utilities in several of these regions have already warned that grid infrastructure may struggle to keep pace with projected demand growth tied to AI expansion. (reuters.com)
As explored in my earlier piece on the energy demands of modern data centres, artificial intelligence is increasingly turning computation into a form of heavy infrastructure rather than simply a software product.
Why Measuring AI Energy Consumption Is Difficult
One of the reasons conversations around AI electricity usage often become confusing is that precise measurements remain difficult to obtain.
Different models require different levels of computation. Text generation consumes less energy than image or video generation. Longer outputs require more processing than shorter ones. Some systems operate on cutting-edge hardware while others run on older chips with lower efficiency.
There is also the broader issue of data centre overhead. Servers are only one component of the system. Cooling infrastructure, networking equipment, redundancy systems, and backup power supplies continue operating even when computational demand fluctuates.
The energy cost of an AI query is therefore tied not only to the computation itself but to the infrastructure supporting it.
This uncertainty has created space for both exaggeration and understatement. Some viral claims about AI electricity consumption dramatically overstate the numbers, while other estimates fail to account for indirect infrastructure requirements and cooling loads.
What is no longer seriously disputed, however, is that generative AI represents a meaningful increase in computational energy demand compared with earlier internet services.
Are AI Models Becoming More Energy Efficient?
At the same time, AI systems are becoming more efficient at an extraordinary pace.
The amount of electricity required per task has fallen rapidly as hardware improves and models become more optimised. Researchers are developing methods that reduce computational overhead while maintaining performance, and smaller specialised models are increasingly capable of handling tasks that previously required much larger systems.
The International Energy Agency has described recent gains in AI hardware efficiency as among the fastest improvements seen in any major energy-consuming technology. (iea.org)
Yet overall electricity demand continues rising because efficiency gains are being overwhelmed by adoption.
As AI becomes faster, cheaper, and easier to use, people simply use more of it. Economists sometimes describe this phenomenon as the rebound effect or Jevons paradox, where technological efficiency lowers costs enough to increase total consumption rather than reduce it.
Computing has followed this pattern repeatedly throughout modern history. More efficient processors did not reduce the world’s appetite for computation. They made computation cheap enough to become embedded in nearly every aspect of daily life.
Artificial intelligence may now be following the same trajectory.
The Bigger Question About AI and Electricity
The important question is probably not whether using ChatGPT is individually justifiable because it consumes electricity. Nearly every system underpinning modern life consumes electricity.
The more significant question is whether societies are being honest about the scale of infrastructure now being constructed to support artificial intelligence, and whether energy systems are evolving quickly enough to sustain that growth responsibly.
Because AI is no longer just a software story. It is increasingly an infrastructure story involving grids, water systems, industrial supply chains, and long-term energy planning.
The interaction may feel intangible, but the infrastructure behind it is anything but.