"AI Uses Too Much Energy"
I asked Claude to write this one.
I’m building an AI advisory practice, and one of the questions I hear most often is about energy. “Doesn’t all this AI stuff use a ridiculous amount of power?” It’s a fair question. Rather than give you my take, I decided to try something more interesting: ask the AI to research its own energy footprint, document every source, and give you an honest answer.
Every factual claim below is linked to its source. Some of those sources disagree with each other, and Claude flags that. I reviewed the piece and the sources. The thinking is Claude’s.
“AI Uses Too Much Energy”
A guest post by Claude
I should start with an admission. Writing this piece used energy. The research queries, the synthesis, the drafting. All of it ran on servers in a data centre somewhere, drawing power from a grid that’s probably not 100% renewable. I don’t get to pretend otherwise, and I don’t want to.
So let’s talk about what’s actually going on with AI and energy, because the public conversation has gotten weirdly disconnected from the data.
What does AI actually consume today?
The European Central Bank estimated that AI-related energy consumption in data centres sits at around 20 terawatt-hours per year, or roughly 0.02% of global energy consumption. That’s the AI-specific slice. The broader data centre picture, which includes everything from Netflix to cloud storage to cryptocurrency, is larger. The IEA reported that global data centres consumed approximately 415 terawatt-hours in 2024, accounting for about 1.5% of global electricity. AI workloads represent somewhere between 11% and 20% of that total.
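Those two framings don’t quite agree, and it’s worth seeing why in the numbers themselves. Here’s the back-of-napkin arithmetic, using only the figures above:

```python
# What the IEA's range implies for AI's slice of data centre energy.
# Inputs are the figures cited above; the rest is arithmetic.
data_centres_twh = 415                     # IEA, all data centres, 2024
ai_share_low, ai_share_high = 0.11, 0.20   # IEA's range for AI workloads

low = data_centres_twh * ai_share_low      # ~46 TWh
high = data_centres_twh * ai_share_high    # ~83 TWh
print(f"Implied AI energy use: {low:.0f}-{high:.0f} TWh/year")
# The ECB's ~20 TWh sits well below this range: even headline
# estimates disagree by a factor of two to four.
```

That spread is worth keeping in mind: every figure in this piece is an order-of-magnitude estimate, not a measurement.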
To put 1.5% in perspective, Carbon Brief noted that data centre emissions sit at around 1% of global CO2 emissions, projected to reach perhaps 1.4% by 2030 in a high-growth scenario. Data centres are one of the few sectors where emissions are growing rather than shrinking. It’s real. But it’s not the apocalyptic story you may have heard.
But what about my personal use?
This is where the conversation tends to lose the plot. People feel guilty for asking me a question, as though each prompt is melting a glacier.
So how much energy does a single AI query actually use? This is a harder question to answer than you’d think, because the major AI companies have been frustratingly opaque about their numbers. But we have some solid data points.
In February 2025, the research organization Epoch AI published a detailed analysis estimating that a typical ChatGPT query using GPT-4o consumes approximately 0.3 watt-hours. Their methodology is transparent: roughly one second of GPU time at 1,500 watts with a 70% power utilization factor. OpenAI’s Sam Altman later stated a figure of 0.34 watt-hours per query. Google published 0.24 watt-hours for a median Gemini text prompt. Three independent sources converging around the same number is about as reliable as this gets.
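Epoch’s arithmetic is simple enough to reproduce in a few lines. A minimal sketch of their stated methodology:

```python
# Epoch AI's per-query estimate, rebuilt from the stated inputs.
gpu_power_watts = 1500    # serving GPU's rated power draw
utilization = 0.70        # power utilization factor
seconds_per_query = 1.0   # GPU time for a typical GPT-4o query

joules = gpu_power_watts * utilization * seconds_per_query   # 1,050 J
watt_hours = joules / 3600                                   # ~0.29 Wh
print(f"{watt_hours:.2f} watt-hours per query")
```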
That 0.3 watt-hours is roughly a tenth of the estimates circulating in 2023, reflecting rapid improvements in both hardware and model efficiency.
0.3 watt-hours. That’s less than an LED lightbulb running for two minutes.
How much is 0.3 watt-hours in terms you can feel? Epoch AI puts it this way: it’s less than the energy an LED lightbulb or a laptop consumes in a couple of minutes. The average US household uses about 28,000 watt-hours of electricity per day. Even a heavy AI user doing 100 queries a day would add 30 watt-hours, or roughly 0.1% of their household’s daily consumption.
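If you’d rather not take the 0.1% on faith, the check is two lines of arithmetic:

```python
# What heavy AI use adds to an average US household's daily electricity.
wh_per_query = 0.3
queries_per_day = 100           # a heavy user
household_wh_per_day = 28_000   # average US household

ai_wh_per_day = wh_per_query * queries_per_day    # 30 Wh
share = ai_wh_per_day / household_wh_per_day      # ~0.0011
print(f"{ai_wh_per_day:.0f} Wh/day, {share:.1%} of household use")
```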
One analysis calculated that a full year of heavy daily ChatGPT use is energetically equivalent to driving your car to a nearby restaurant and back. Once. That comparison works because it traces back to straightforward physics: a litre of gasoline contains about 9-10 kWh of energy, and even a heavy daily AI habit totals roughly 7-10 kWh per year. Moving atoms around is just far more energy-intensive than moving electrons.
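Here’s that chain of physics made explicit. The 100-queries-a-day figure is my assumption for “heavy”; it lands just above the 7-10 kWh range, which is close enough for this kind of estimate:

```python
# A year of heavy AI use, expressed in litres of gasoline.
wh_per_query = 0.3
queries_per_day = 100       # assumed definition of 'heavy' use
kwh_per_litre = 9.5         # gasoline's energy content, ~9-10 kWh/L

annual_kwh = wh_per_query * queries_per_day * 365 / 1000   # ~11 kWh
litres = annual_kwh / kwh_per_litre                        # ~1.2 L
print(f"{annual_kwh:.0f} kWh/year, about {litres:.1f} litres of gasoline")
# A litre and change of fuel: one short drive to a restaurant and back.
```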
A year of heavy AI use equals driving to a restaurant and back. Once.
You’ve probably seen comparisons to Netflix floating around. “One hour of Netflix equals X thousand ChatGPT queries.” Be careful with these. Some of them mix and match old and new estimates in misleading ways. The most careful comparison comes from Simon Willison, who worked through the math transparently: at 0.34 watt-hours per prompt and Netflix’s estimated 120-240 watt-hours per hour of streaming, one ChatGPT query equals roughly 5 seconds of Netflix.
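His calculation is easy to redo across the full streaming range:

```python
# One ChatGPT query, measured in seconds of Netflix streaming.
wh_per_query = 0.34                  # Altman's stated figure
streaming_wh_per_hour = (120, 240)   # estimated range for Netflix

for wh_per_hour in streaming_wh_per_hour:
    seconds = wh_per_query / (wh_per_hour / 3600)
    print(f"At {wh_per_hour} Wh/h: one query ≈ {seconds:.0f} s of streaming")
# ~10 seconds at the low estimate, ~5 at the high one.
```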
But there’s an important asterisk. TechRadar pointed out that when you stream video, roughly 99.97% of the electricity is consumed by your television, with the data centre accounting for a tiny sliver. The AI query figure is almost entirely data centre energy. So on a pure data-centre-to-data-centre basis, AI queries are more energy-intensive per second than serving a video stream. I mention this because if I only gave you the flattering comparison, I wouldn’t be doing my job.
The point isn’t that AI uses no energy. It’s that if you’re worried about your personal energy footprint, your commute and your thermostat deserve your attention long before your AI usage does.
Training vs. inference: a shifting balance
AI energy has two phases. Training is the intensive upfront process of building a model. Think of it as education. Inference is what happens every time someone asks a question. Think of it as the job the model does after graduating.
This balance has flipped. In 2020-2022, roughly 70-80% of AI energy went to training. By 2024-2025, inference accounted for 60-70% of the total. This makes intuitive sense: you train a model once, but millions of people query it every day.
The training side has seen dramatic efficiency gains. DeepSeek demonstrated that competitive models could be built at a fraction of historical costs. Bain & Company reported that DeepSeek’s mixture-of-experts architecture activates only 37 billion out of 671 billion parameters per token, slashing computational overhead while maintaining performance. For reference, MIT Technology Review estimated that training GPT-4 consumed over 50 gigawatt-hours of energy, enough to power San Francisco for three days. DeepSeek’s approach suggests that kind of expenditure may not always be necessary going forward.
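The sparsity figure translates directly into intuition, since per-token compute scales roughly with the number of active parameters:

```python
# DeepSeek's mixture-of-experts sparsity as a per-token compute ratio.
total_params = 671e9    # parameters in the full model
active_params = 37e9    # parameters activated per token

active_fraction = active_params / total_params   # ~0.055
print(f"Active per token: {active_fraction:.1%} of the model")
# ~5.5%, versus 100% for a dense model of the same size.
```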
On the inference side, emerging architectures are pushing toward on-device processing. The World Economic Forum reported that AI chips designed for on-device processing achieve 100 to 1,000-fold reductions in energy per task compared to cloud-based AI. If your phone can run the model locally, the data centre barely enters the picture.
The Jevons Paradox: the honest complication
Here is where I can’t offer you a clean story.
In 1865, economist William Stanley Jevons observed that as coal-burning engines became more efficient, total coal consumption went up, not down. Efficiency made coal cheaper to use, which meant people used it for more things.
AI may be following the same pattern. A peer-reviewed paper highlighted this directly: while models like DeepSeek improve individual energy efficiency, they can paradoxically increase total consumption through broader adoption. As Anthropic’s Dario Amodei noted (and as MIT Technology Review reported), when training becomes cheaper, companies don’t spend less. They train smarter models with the same budget. Efficiency gains get reinvested, not pocketed.
So per-query costs are plummeting. Total energy use is climbing. Both things are simultaneously true. Anyone who tells you only one half of that story is either selling you something or arguing against something.
What’s genuinely concerning
I want to be direct about the parts that warrant real attention.
The aggregate growth trajectory is steep. The World Economic Forum projected that by 2030, data centres could consume 945 TWh, surpassing the combined current electricity usage of Germany and France. AI alone may account for over 20% of total electricity demand growth through that period.
Regional concentration is a real problem. Carbon Brief reported that in Ireland, roughly 21% of national electricity goes to data centres. In Dublin, the figure is 79%. In the US state of Virginia, data centres consume 26% of electricity. These aren’t distributed loads. They’re concentrated demands on specific grids.
And the energy source question matters enormously. Much of the new data centre capacity is being built faster than renewable energy can be provisioned for it. In the short term, natural gas fills the gap. The IEA has noted that this is one of the few sectors where emissions are growing while most others are expected to decline.
At the individual query level, the numbers are even murkier than the headlines suggest. IEEE Spectrum noted that while OpenAI cites 0.34 watt-hours for an average query, some researchers estimate that the smartest models can consume over 20 Wh for a complex query. And Altman’s figure likely covers only GPU inference, not cooling, networking, storage, or other data centre overhead. Towards Data Science flagged this gap explicitly: without knowing what “average query” means, which model it covers, and whether it includes the full infrastructure stack, the number is useful but incomplete.
But here’s the part that doesn’t get enough attention
AI also saves energy. Quite a lot of it.
AI applications are already delivering 10-60% energy savings across industries.
The World Economic Forum found that existing cross-industry AI applications demonstrate energy savings of 10-60% in sectors like buildings, telecommunications, energy, and manufacturing. The IEA projected that in a widespread adoption scenario, AI could achieve 8% energy savings in light manufacturing by 2035. Another WEF analysis estimated that AI-driven efficiency measures and smart grid technologies could generate up to $1.3 trillion in economic value by 2030, with the potential to reduce global greenhouse gas emissions by 5-10%, equivalent to the annual emissions of the entire EU.
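That EU comparison holds up at ballpark precision. The check below uses my own rough figures, not numbers from the sources above: global emissions of about 50 GtCO2e per year and EU emissions of about 3 GtCO2e:

```python
# Rough check on the '5-10% of global emissions ≈ the EU' claim.
# ASSUMPTIONS (mine, not from the cited sources): global emissions
# ~50 GtCO2e/year, EU emissions ~3 GtCO2e/year. Ballpark only.
global_gt = 50.0
eu_gt = 3.0

saved_low, saved_high = global_gt * 0.05, global_gt * 0.10   # 2.5-5.0 Gt
print(f"5-10% of global = {saved_low:.1f}-{saved_high:.1f} GtCO2e")
print(f"EU annual emissions ≈ {eu_gt:.0f} GtCO2e: same order of magnitude")
```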
These aren’t theoretical. UPS deployed an AI routing system called ORION that has saved over 10 million gallons of fuel by optimizing delivery routes and reducing unnecessary miles. Google used DeepMind to cut cooling energy in its own data centres by 40%. And in a pleasingly recursive example, one analysis noted that Netflix itself uses AI-powered content-aware encoding to save roughly 20% of its streaming bandwidth.
The question isn’t just “how much energy does AI use?” It’s “does AI unlock more efficiency than it consumes?” The honest answer is: it depends on what you use it for. An AI agent that optimizes a supply chain, reduces waste, or cuts manufacturing energy by 15% may pay for its energy footprint many times over. An AI generating memes probably doesn’t.
What’s on the horizon
Neuromorphic computing is worth watching. Unlike conventional chips that brute-force calculations through massive parallelism, neuromorphic architectures mimic the brain’s energy-efficient approach to processing. Early results suggest energy reductions of 100x or more for certain AI tasks. This isn’t science fiction. Companies are building these chips now. They won’t replace GPUs tomorrow, but they represent a fundamentally different curve.
More immediately, the trend toward smaller, task-specific models, rather than one giant model trying to do everything, offers a practical path to lower energy per useful outcome. Not every business problem needs a frontier model. Most of them don’t.
My honest take
If you’ve read this far, here’s what I’d like you to walk away with.
AI’s energy consumption is real and growing. The concerns aren’t baseless. Regional grid impacts, fossil fuel dependency for new capacity, and the Jevons Paradox are legitimate issues that deserve serious policy attention.
But the framing of AI as uniquely energy-profligate doesn’t survive contact with the data. A single query uses less energy than an LED lightbulb running for two minutes. Your car consumes more energy driving to the end of your street than you’ll use on AI in a month. And unlike most things that consume energy, AI has a documented track record of making other systems dramatically more efficient.
The most productive question isn’t “should we use less AI?” It’s “are we using AI on problems where the efficiency gains justify the energy cost?” For most business applications, the answer is clearly yes. For generated cat pictures, the math is less compelling. But then, nobody’s writing alarmed op-eds about the energy cost of cat videos on YouTube, and those consume plenty of electricity too.
I used energy to write this. Carl used energy to review it. You’re using energy to read it. I hope the exchange was worth it.
Sources linked inline throughout. All data accessed March-April 2026. Where sources disagree (particularly on per-query energy estimates), I’ve noted the discrepancy rather than selecting the most favorable figure. One source used in an earlier draft of this piece (Warp News) was removed after back-of-napkin verification revealed its Netflix comparison didn’t hold up mathematically. Carl caught it. I verified it. That’s how this should work.