NHacker Next
Estimating AI energy use (spectrum.ieee.org)
qnleigh 5 hours ago [-]
For reference, global energy consumption is about 180,000 TWh[1]. So while the numbers in this article are large, they're not a significant fraction of the total. Traveling and buying things are probably a much bigger part of your carbon footprint. For example:

- 25 LLM queries: ~8.5 Wh

- driving one mile: ~250-1000 Wh

- one glass bottle: ~1000 Wh [2]

- a new laptop: ~600,000 Wh [3]

- round-trip flight from LA to Tokyo: ~1,000,000 Wh

[1] https://ourworldindata.org/energy-production-consumption

[2] https://www.beveragedaily.com/Article/2008/03/17/study-finds...

[3] https://www.foxway.com/wp-content/uploads/2024/05/handprint-...
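The list above can be sanity-checked in a few lines of Python. The 0.34 Wh/query figure is Altman's unverified claim; the other numbers are taken straight from the list:

```python
# Rough energy comparison in watt-hours, using the figures quoted above.
WH_PER_QUERY = 0.34  # Altman's claimed per-query figure (unverified)
budget = 25 * WH_PER_QUERY  # the "25 LLM queries" baseline, 8.5 Wh

activities = {
    "driving one mile (EV, low end)": 250,
    "one glass bottle": 1_000,
    "a new laptop": 600_000,
    "round-trip flight LA-Tokyo": 1_000_000,
}
for name, wh in activities.items():
    # how many 25-query sessions each activity is worth
    print(f"{name}: {wh:>9,} Wh = {wh / budget:>9,.0f}x 25 queries")
```

A mile of driving at the low end equals roughly 29 such 25-query sessions; the flight equals about 118,000 of them.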

deaux 1 hour ago [-]
For reference, this is based on figures given by Sam Altman, which are worth as much as going to random.org and asking it for a number between 0 and 100 to use as Wh per LLM query.

Scratch that, the latter is probably more reliable.

dayjaby 1 hour ago [-]
What about LLM training? What about training all the discarded or unused LLMs?
ugurs 5 hours ago [-]
Without knowing the cumulative amount of energy consumption, it is not a fair comparison. If there are one billion LLM sessions every day, it is still a lot of energy.
kurthr 5 hours ago [-]
No, not really. A billion people (15% of the population) drive more than a mile a day. Well over 100 million laptops are sold every year. These are easy numbers to look up.

Just look at your own life and see how much of each you would use.

globular-toast 2 hours ago [-]
So do LLMs mean fewer people have to drive a mile every day?
exitb 1 hour ago [-]
These don't have to be dependent to be meaningful.
AlecSchueler 45 minutes ago [-]
I think the point is that we all need to use less energy. We need to avoid flights from LA to Tokyo where possible, not use the energy figures as an excuse to use even more energy.
qnleigh 1 hour ago [-]
Yes, quoting energy use per query isn't the full picture, though it is still a useful benchmark for understanding the relative impact of one's use as an individual. As for cumulative impact, the IEEE article gives an estimate of 347 TWh per year by 2030, which is still a very small fraction of global energy consumption today.
s0rce 5 hours ago [-]
180,000 TWh total since the start of time or per year?
kurthr 5 hours ago [-]
It was 170,000 TWh annually in 2021.
Gigachad 4 hours ago [-]
Model usage seems quite small compared to training. I can run models on my phone which took millions of hours of GPU training time to create. Although this might change with the new AI slop tiktok apps every company is rushing to create now.
gaoshan 6 hours ago [-]
One thing it's doing is jacking up electricity rates for US states that are part of the [PJM Interconnection grid](https://en.wikipedia.org/wiki/PJM_Interconnection). The capacity auction price, which is used to guarantee standby availability, is [up significantly](https://www.toledochamber.com/blog/watts-up-why-ohios-electr...) at $270.43 per MW/day, far above prior years (~$29–58/MW/day), and this is translating into significantly higher consumer prices.
givemeethekeys 6 hours ago [-]
Why are consumers paying for electricity used by server farms? Why can't the electricity companies charge the server farms instead?

Where I live, the utility company bills you at a higher rate if you use more electricity.

Ekaros 2 hours ago [-]
Are they paying for electricity used by server farms? Or are they just paying more profit to the owners of electricity producers? Do server farms get electricity below market price?

Of course, long-term contracts and options are possibly involved in some of these markets. But there the option sellers would bear the cost.

XorNot 5 hours ago [-]
Because electricity prices are an auction, so increased demand is bidding up the price anyway.

You need strong residential consumer protections to avoid this.

beeflet 1 hour ago [-]
do you? maybe we just need more supply
renewiltord 50 minutes ago [-]
The residential consumers also oppose that. Usually they try very hard to reduce supply, e.g. Diablo Canyon NPP.
blueblisters 5 hours ago [-]
> consumers paying for electricity used by server farms

wait what? consumers are literally paying for server farms? this isn't a supply-demand gap?

MatekCopatek 44 minutes ago [-]
It's a supply-demand gap, but since the reasons for it are very apparent, it's completely reasonable to describe it as "consumers paying for [the existence of] datacenters".
justlikereddit 2 hours ago [-]
This is a recurrent question and not just for servers.

In Europe it is constantly: "Why do the households of half of Europe pay for Germany's unwillingness to have a good power mix? Why should anyone want more cross-country or long-range interconnects if they drive up local prices?"

Take Norway, with its abundant hydropower; by all rights they should have cheap power. But reality is not so in half of the country, because they're sufficiently interconnected to end up on a common European bidding market, paying blood money for the poor political choices of countries they don't even share a border with.

Addition: this also creates perverse incentives. Many of the interconnected, flat euro countries would love enormous hydropower overcapacity to be built in Norway at the cost of the local nature. This is great for whoever sells the hydropower. It's great for a politician who can show off a green all-hydro power mix in a country as hilly as a neutron star. But it is not great for whoever gets their backyard hiking trails reduced to a hydro reservoir.

But hey, we do it with everything else too: "Open-pit mines are too destructive to have in our country, so we'll buy from China and pretend we're making the green choice." Globalism in a nutshell: export your responsibility.

barbazoo 6 hours ago [-]
Charge them more than individual consumers? Why? Let the market decide how much electricity should cost. /s
givemeethekeys 4 hours ago [-]
Hehe. Well, if the market is no good for its participants, then at least there is a viable alternative for many of them.
quaintdev 2 hours ago [-]
In India, we have different energy consumption bands, like 0-200 kWh, 200-400 kWh, and so on. People whose usage falls in the 0-200 kWh band pay a lower rate than those in the 200-400 kWh band, and so on.
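A slab tariff like this is easy to model. The slab widths and rates below are made up for illustration; real Indian tariffs vary by state and utility:

```python
def tiered_bill(units_kwh, slabs=((200, 3.0), (200, 5.0), (float("inf"), 7.0))):
    """Bill for a slab tariff: each (width_kwh, rate) tuple is one band.
    These rates are illustrative, not any utility's actual tariff."""
    bill, remaining = 0.0, units_kwh
    for width, rate in slabs:
        used = min(remaining, width)  # usage that falls in this band
        bill += used * rate
        remaining -= used
        if remaining <= 0:
            break
    return bill

print(tiered_bill(150))  # all usage inside the cheap 0-200 kWh band
print(tiered_bill(350))  # 200 kWh at the low rate, 150 kWh at the next
```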
StrangeDoctor 6 hours ago [-]
I think the unit you and the article want is MW-day of unforced capacity (UCAP), not MW/day.

PJM claims this will be a 1.5-5% yoy increase for retail power. https://www.pjm.com/-/media/DotCom/about-pjm/newsroom/2025-r...
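As a back-of-envelope check, the auction price can be spread over the energy a megawatt of load could deliver in a day. This is my own rough conversion, not PJM's methodology:

```python
# Spread the capacity price over a day's worth of energy from 1 MW of load.
CAPACITY_PRICE = 270.43      # $/MW-day, the figure quoted in the thread
KWH_PER_MW_DAY = 1_000 * 24  # kWh delivered by 1 MW running flat-out for 24 h
adder = CAPACITY_PRICE / KWH_PER_MW_DAY * 100  # cents per kWh
print(f"~{adder:.1f} cents/kWh at a 100% load factor")
```

Real load factors are well under 100%, so the per-kWh impact is larger; still, the result is consistent in scale with PJM's 1.5-5% retail estimate.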

tylervigen 1 hour ago [-]
One thing we should be careful about regarding calculations related to the larger set of "all data centers" vs only "GenAI" is that the data centers include all the predictive algorithms for social media and advertising. I, for one, would not want to misdirect ire at ChatGPT that really belongs directed at ads.
Mistletoe 6 hours ago [-]
You are paying for AI whether you want it or not. Just use it at least I guess. You have no say over anything else.
geuis 7 hours ago [-]
My thoughts.

Current gen AI is going to result in the excess-datacenter equivalent of dark fiber from the 2000s: lots of early buildout and heavy investment, followed by a lack of customer demand and, later, cheaper access to physical compute.

The current neural network software architecture is pretty limited. Hundreds of billions of dollars of investor money has gone into scaling backprop networks and we've quickly hit the limits. There will be some advancements, but it's clear we're already at the flat part of the current s-curve.

There's probably some interesting new architectures already in the works either from postdocs or in tiny startups that will become the base of the next curve in the next 18 months. If so, one or more may be able to take advantage of the current overbuild in data centers.

However, compute has an expiration date, like old milk. It won't physically expire, but its economic potential decreases as technology advances. Still, if the timing is right, there is going to be a huge opportunity for the next early adopters.

So what's next?

ch4s3 6 hours ago [-]
If the end result here is way overbuilt energy infrastructure that would actually be great. There’s a lot you can do with cheap electrons.
riku_iki 6 hours ago [-]
I suspect it will mostly be fossil power capacity, which is much easier to scale up
ch4s3 6 hours ago [-]
I wouldn’t be so sure about that. Several of the big names in this space have green energy pledges and are actively building out nuclear power.
epistasis 4 hours ago [-]
Nobody is actively building out nuclear power. Microsoft is turning on a recently decommissioned facility.

New nuclear is too expensive to make sense. At most there are small investments in flash-in-the-pan startups that are failing to deliver plans for small modular reactors.

The real build out that will happen is solar/wind with tons of batteries, which is so commonplace that it doesn't even make the news. Those can be ordered basically off the shelf, are cheap, and can be deployed within a year. New nuclear is a 10-15 year project, at best, with massive financial risk and construction risk. Nobody wants to take those bets, or can really afford to, honestly.

beeflet 1 hour ago [-]
There are a couple companies doing HTGR and SMRs that seem to be on track.
lovecg 3 hours ago [-]
> The real build out that will happen is solar/wind with tons of batteries

That actually sounds awesome, is there a downside I’m not seeing?

riku_iki 6 hours ago [-]
gyomu 4 hours ago [-]
Yes, that makes sense.

Also adding to that tendency, I suspect as the tech matures more and more consumer space models will just run on device (sure, the cutting edge will still run in server farms but most consumer use will not require cutting edge).

christkv 55 minutes ago [-]
This. I totally agree: we will see better architectures for doing the calculations, lower-energy inference hardware, and some models running locally, moving some of the "basic" inference off the grid.

It's going to move fast, I think, and I would not be surprised if the energy cost of inference is 1/10 of today's in less than 5 years.

rpcope1 5 hours ago [-]
Hopefully there's a flood of good cheap used Supermicro and other enterprise gear and maybe a lot of cheap colo.
driverdan 6 hours ago [-]
This is one possibility I'm assuming as well. It largely depends on how long this bubble lasts. At the current growth rate it will be unsustainable before many very large DCs can be built so it's possible the impact may not be as severe as the telecom crash.

Another possibility is that new breakthroughs significantly reduce computational needs, efficiency significantly improves, or some similar improvements that reduce DC demand.

KPGv2 4 hours ago [-]
> There's probably some interesting new architectures already in the works either from postdocs or in tiny startups

It is not clear to me why we will have a breakthrough after virtually no movement on this front for decades. Backpropagation is literally 1960s technology.

benzible 3 hours ago [-]
Because tremendous rewards will spur a huge increase in research?
refulgentis 6 hours ago [-]
It's a line (remindme! 5 years)
simonw 7 hours ago [-]
If I'm interpreting this right it's estimating that ChatGPT's daily energy usage is enough to charge just 14,000 electric vehicles - and that's to serve in the order of ~100 million daily users.
blibble 7 hours ago [-]
> We used the figure of 0.34 watt-hours that OpenAI’s Sam Altman stated in a blog post without supporting evidence.

what do you think the odds of this being accurate are?

zero?

JimDabell 6 hours ago [-]
Why would you assume that? It’s in line with estimates that were around before he posted that article and it’s higher than Gemini. It’s a pretty unsurprising number.
deaux 1 hour ago [-]
Back in the 1980s, I'm sure Philip Morris claimed numbers on cigarettes (not) causing cancer similar to R.J. Reynolds'.

I also wouldn't be surprised if Aramco and Rosneft gave similar estimates on global warming and oil's role in it.

simonw 7 hours ago [-]
Hard to say. Sam wrote that on June 10th this year: https://blog.samaltman.com/the-gentle-singularity

GPT-5 came out on 7th August.

Assuming the 0.34 value was accurate in the GPT-4o era, is the number today still in the same ballpark or is it wildly different?

blibble 7 hours ago [-]
the "AI" industry have identified that energy usage is going to be used as a stick to beat them with

if I was altman then I'd release a few small numbers to try and get influencers talking about "how little energy chatgpt uses"

and he can never be accused of lying, as without any methodology as to how it was calculated it's unverifiable and completely meaningless

win-win!

a_wild_dandan 6 hours ago [-]
I would bet that it's far lower now. Inference is expensive, but we've made extraordinary efficiency gains through techniques like distillation. That said, GPT-5 is a reasoning model, and those are notorious for high token burn. So who knows, it could be a wash. But the selective pressure to optimize for scale/growth/revenue/independence from MSFT/etc. makes me think that OpenAI is chasing those watt-hours pretty doggedly. So 0.34 is probably high...

...but then Sora came out.

yen223 6 hours ago [-]
Yeah, something we are confident about is that

a) training is where the bulk of an AI system's energy usage goes (based on a report released by Mistral)

b) video generation is very likely a few orders of magnitude more expensive than text generation.

That said, I still believe that data centres in general - including AI ones - don't consume a significant amount of energy compared with everything else we do, especially heating and cooling and transport.

Pre-LLM data centres consume about 1% of the world's electricity. AI data centres may bump that up to 2%.

simonw 5 hours ago [-]
You mean this Mistral report? https://mistral.ai/news/our-contribution-to-a-global-environ...

I don't think it shows that training uses more energy than inference over the lifetime of the model - they don't appear to share that ratio.

bluefirebrand 3 hours ago [-]
> don't consume a significant amount of energy compared with everything else we do, especially heating and cooling and transport

Ok, but heating and cooling are largely non-negotiable. We need those technologies to make places liveable.

LLMs are not remotely as crucial to our lives

blondie9x 5 hours ago [-]
You gotta start thinking about the energy used to mine and refine the raw materials used to make the chips and GPUs. Then take into account the infrastructure and data centers.

The amount of energy is insane.

moralestapia 7 hours ago [-]
I was about to post this exact thing.

Seems ... low? And it will only get more efficient going forward.

I don't get why this is supposed to be a big deal for infrastructure, since there are definitely way more than 14,000 EVs out there and we are doing well.

ares623 7 hours ago [-]
The infrastructure needs to go somewhere. And that somewhere needs to have access to abundant water and electricity. It just so happens those are things humans need too.

Before GenAI we were on our way to optimizing this, at least to the level where the general public could turn a blind eye. It got to the point where the companies would brag about how efficient they were. Now all that progress is gone, and we're accelerating backwards. Maybe that was all a lie too. But then who's to say the current numbers aren't a lie as well, to make the pill easier to swallow?

moralestapia 7 hours ago [-]
Hmm, I guess we'll have to do it the slow way ...

What % of EVs on the market is 14,000?

SweetSoftPillow 55 minutes ago [-]
Very useful context:

"How much energy does Google’s AI use? We did the math": https://cloud.google.com/blog/products/infrastructure/measur...

b00ty4breakfast 28 minutes ago [-]
>We used the figure of 0.34 watt-hours that OpenAI’s Sam Altman stated in a blog post without supporting evidence. It’s worth noting that some researchers say the smartest models can consume over 20 Wh for a complex query. We derived the number of queries per day from OpenAI's usage statistics below.

I honestly have no clue how much trust to place in data from a blog post written by a guy trying to make people give him lots of money. My gut is to question every word that comes out of his mouth, but maybe I'm pessimistic in that regard.

But besides that, the cost of this stuff isn't just the energy consumption of the computation itself; the equipment needs to be manufactured, raw materials need to be extracted and processed, supplies and manpower need to be shuffled around. Construction of associated infrastructure has its own costs as well. What are we, as a society (as opposed to shareholders and executives), going to get in return, and is it going to be enough to justify the costs, not just in terms of cash but also resources? To say nothing of the potential environmental impact of all this.

driverdan 6 hours ago [-]
This doesn't seem to factor in the energy cost of training which is currently a very significant overhead.
simianwords 53 minutes ago [-]
How do you know it is significant?
maxglute 4 hours ago [-]
>The Schneider Electric report estimates that all generative AI queries consume 15 TWh in 2025 and will use 347 TWh by 2030; that leaves 332 TWh of energy—and compute power—that will need to come online to support AI growth.

+332 TWh is like... +1% of US energy consumption, or +8% of US electricity. If the AI bubble bursts ~2030, that's functionally what the US will be left with (assuming the new power infra actually gets built) mid/long term, since compute depreciates over 1-5 years. For reference, the dotcom burst left the US with a fuckload of fiber layouts that last 30/40/50+ years. We're still using capex from the railroad bubble 100 years ago. I feel like people are failing to grasp how big of an F the US will eat if AI bursts, relative to past bubbles. I mean, it's better than tulip mania, but obsolete AI chips are also closer to tulips than to fiber or rail in terms of stranded depreciated assets.
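The percentages roughly check out against round US figures (both denominators are my approximations):

```python
# Sanity check on "+1% of US energy consumption, or +8% of US electricity".
ADDED_TWH = 332
US_TOTAL_ENERGY_TWH = 26_000  # approx. annual US primary energy consumption
US_ELECTRICITY_TWH = 4_200    # approx. annual US electricity generation
print(f"{ADDED_TWH / US_TOTAL_ENERGY_TWH:.1%} of total US energy")  # ~1.3%
print(f"{ADDED_TWH / US_ELECTRICITY_TWH:.1%} of US electricity")    # ~7.9%
```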

blueblisters 5 hours ago [-]
Math comparing new datacenter capacity to electric cars -

Projections estimate anywhere between 10GW to 30GW of US datacenter buildup over the next few years

1GW of continuous power can support uniform draw from ~2.6M Tesla Model 3s assuming 12,000 miles per year, 250Wh/mile.

So 26M on the lower end, 80M Model 3s on the upper end.

That's 10x-30x the cumulative number of Model 3s sold so far

And remember, all datacenter draw is concentrated. It will disproportionately impact regions where datacenters are being built.

We need new, clean power sources yesterday
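The per-gigawatt figure above can be reproduced from the stated assumptions; the charging-efficiency term below is my guess at how the ~2.6M number was reached:

```python
# How many Model 3s can 1 GW of continuous power keep charged?
MILES_PER_YEAR = 12_000
WH_PER_MILE = 250
CHARGING_EFFICIENCY = 0.9  # assumption: ~10% charging losses
twh_per_gw_year = 8_760 / 1_000  # 1 GW running for a year, in TWh
kwh_per_car = MILES_PER_YEAR * WH_PER_MILE / 1_000 / CHARGING_EFFICIENCY
cars_millions = twh_per_gw_year * 1e9 / kwh_per_car / 1e6
print(f"~{cars_millions:.1f}M cars per GW")
```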

blondie9x 7 hours ago [-]
This doesn't include the energy for mining and chip production either. Can you imagine if it did?

Then what about when you take into account the amount of water used to cool the data centers, as well as the water used in the extraction and production process? Things get insane then. https://www.forbes.com/sites/cindygordon/2024/02/25/ai-is-ac...

logicallee 1 hour ago [-]
The energy used to manufacture the datacenters, including the GPUs, must also be rather high. Manufacturing is an energy-intensive sector.

Edit: I asked ChatGPT-5:

https://chatgpt.com/share/68e36c19-a9a8-800b-884e-48fafbe0ec...

it says:

>the manufacturing of GPUs and datacenters themselves consumes a large amount of energy, not just their operation. The operational energy use (for AI training, inference, cooling, etc.) gets most of the attention, but the embodied energy — the energy used to extract raw materials, manufacture chips and components, and construct facilities — is substantial.

and summarizes it with:

    4. Bottom Line

     • Manufacturing GPUs and datacenters is highly energy-intensive, but operational energy dominates over time.
     • For a single GPU, embodied energy ≈ 0.5–1 MWh.
     • For a datacenter, embodied energy ≈ 6–12 months of its operational energy.
     • Given AI-scale deployments (millions of GPUs), the embodied manufacturing energy already reaches terawatt-hours globally — roughly comparable to the annual electricity use of a small country.
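Taking the quoted per-GPU range at face value (it comes from ChatGPT and is unverified), the "terawatt-hours globally" claim at least follows arithmetically; the deployment count below is my assumption:

```python
# Scale check on the embodied-energy claim quoted above.
GPUS_DEPLOYED = 4e6          # assumption: a few million AI GPUs in service
EMBODIED_MWH_PER_GPU = 0.75  # midpoint of the quoted 0.5-1 MWh range
total_twh = GPUS_DEPLOYED * EMBODIED_MWH_PER_GPU / 1e6
print(f"~{total_twh:.0f} TWh embodied manufacturing energy")
```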
ChrisArchitect 4 hours ago [-]
Related:

OpenAI’s Hunger for Computing Power Has Sam Altman Dashing Around the Globe

https://news.ycombinator.com/item?id=45477192

metadat 3 hours ago [-]
(94 comments, 1 day ago)

Thanks a lot, Chris!

EcommerceFlow 4 hours ago [-]
The question to ask is why the supply of energy hasn't kept up with demand. Regulation (primarily in Democratic states) is most likely the answer. When you use government incentives to pick winners and losers among energy sources, it throws the entire energy market out of sync.