> Tech companies like Meta, Amazon, and Google have responded to this fossil fuel issue by announcing goals to use more nuclear power. Those three have joined a pledge to triple the world’s nuclear capacity by 2025.
Erm ... that's a weird date considering this article came out yesterday. They actually pledged to triple the world's nuclear capacity by 2050[1]
There are a couple of weird things like that in this article, including the classic reference to "experts" for some of its data points. Still ... at least somebody's trying to quantify this.
The weirdest thing in the article is the refusal of Big Tech to release this data. We shouldn't need experts to guess. Society must have this information to make decisions about something that affects the whole world.
WillPostForFood 35 minutes ago [-]
Do you think it is weird or just a typo?
troyvit 13 minutes ago [-]
Heh, when I first started typing it I thought it was weird, then I discovered it was a typo, but didn't edit my first sentence. Weird huh?
bogtog 5 hours ago [-]
> The largest model we tested has 405 billion parameters, but others, such as DeepSeek, have gone much further, with over 600 billion parameters.
Very quickly skimming, I have some trouble taking this post seriously when it omits that the larger DeepSeek one is a mixture-of-experts that will only use 12.5% (iirc) of its components for each token.
The best summary of text energy use I've seen is this (seemingly more rigorous, although its estimates are consistent with the final numbers made by the present post): epoch.ai/gradient-updates/how-much-energy-does-chatgpt-use
Estimates for a given response vary widely between a "typical" query (0.3 Wh; 1080 joules) and a maximal-context query (40 Wh; 144k joules). Assuming most uses don't come close to maximizing the context, the energy use of text seems very small compared to the benefits. That being said, the energy use for video generation seems substantial.
I would be interested in seeing the numbers corresponding to how LLMs are typically used for code generation
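The unit conversions in those estimates are easy to sanity-check. A quick sketch (the Wh figures are the epoch.ai estimates quoted above; the conversion factor is standard):

```python
# Convert the quoted per-response estimates from watt-hours to joules.
J_PER_WH = 3600  # 1 Wh = 3600 J

typical_query_wh = 0.3   # "typical" ChatGPT query (epoch.ai estimate)
max_context_wh = 40.0    # maximal-context query (epoch.ai estimate)

print(typical_query_wh * J_PER_WH)  # 1080 J
print(max_context_wh * J_PER_WH)    # 144000 J, i.e. the "144k joules" above
```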
SoftTalker 19 hours ago [-]
If you are old enough you remember posting to Usenet and the warning that would accompany each new submission:
This program posts news to thousands of machines throughout the entire civilized world. Your message will cost the net hundreds if not thousands of dollars to send everywhere. Please be sure you know what you are doing. Are you absolutely sure that you want to do this? [ny]
Maybe we need something similar in LLM clients. It could be phrased in terms of how many pounds of atmospheric carbon the request will produce.
mirekrusin 15 minutes ago [-]
Or the gigantic footer explaining that you should consider the environmental impact before printing this email – which always ate up an extra page when printed.
smusamashah 6 hours ago [-]
Your quote is actually telling the opposite of your suggestion.
So they used to send this message, but then it stopped I assume. Costs lowered a lot or the benefits outweighed all associated costs. Same can happen here.
FollowingTheDao 1 hours ago [-]
> Costs lowered a lot or the benefits outweighed all associated costs.
How is this even quantifiable?
How about this. Before using AI to make fake images and help kids cheat on their homework, we take it offline and use it to solve its own problem of energy use.
You know why this does not happen? Because the goal is profit, and the profit comes not from solving real, important problems, but from making people think it is helping them solve made-up problems.
yongjik 18 hours ago [-]
A lot of us live in a country where "rolling coal" is a thing. I fear your prompt may have an opposite of the intended effect.
nightski 1 hours ago [-]
How does it make you feel that your computing carbon footprint may be higher than rolling coal? Are we the problem?
rcpt 51 minutes ago [-]
How did you get those numbers? Based on the article and my searching around
Llama 3.1: ~0.7 to 1.5 grams of CO2
Rolling coal event: ~10,000 to 100,000+ grams of CO2
FollowingTheDao 1 hours ago [-]
> Are we the problem?
Yes.
UnreachableCode 9 hours ago [-]
You say that as if rolling coal people are capable of using or understanding LLMs
runjake 2 hours ago [-]
I personally know of at least 5 “rolling coal” people (aka “rednecks”) that use ChatGPT on the regular.
You underestimate the pervasiveness of AI, and in particular, ChatGPT. It is quite popular in the blue collar trades.
And yeah, a lot of them probably regard everything that ChatGPT tells them as fact.
neves 33 minutes ago [-]
I needed to ask ChatGPT to understand you :-)
"Rolling coal" is the practice of modifying a diesel engine—usually in pickup trucks—to increase the amount of fuel entering the engine, which causes the vehicle to emit large, thick plumes of black smoke from the exhaust. This is often done by tampering with or removing emissions control devices.
lvturner 9 hours ago [-]
While I might question their sanity and/or ethics, it's generally not a good idea to underestimate a fool.
GuB-42 5 hours ago [-]
Not sure about understanding, but anyone can use a LLM. That is the most intuitive way to interact with a computer and that's the entire point. It may even work on animals. There is serious research on how LLMs could interpret animal language, like with dolphins.
FollowingTheDao 60 minutes ago [-]
> It may even work on animals. There is serious research on how LLMs could interpret animal language, like with dolphins.
This is one of the more hysterical things I have heard.
How would we even know it was translating animal language correctly?
The need for ever-expanding profit ensures that they will be addressed as a market. Give it six months.
wat10000 4 hours ago [-]
You think they can operate a motor vehicle but don’t know how to type into a text box on chatgpt dot com?
cess11 3 hours ago [-]
The dangers of that web site are much more subtle and hard to defend against than those associated with "a motor vehicle", by which I guess you mean something like a car.
You can see traffic. It's easy to understand the dangers in a collision because when you drive into something unexpectedly your body takes a hit and you get frightened since you immediately realise that it might cost you a lot of money but you don't know for sure.
Being subtly manipulated by a disgustingly subservient fake conversationalist is another thing altogether.
bravetraveler 8 hours ago [-]
You say that as if LLMs aren't another dumbing-down interface.
01HNNWZ0MV43FF 18 hours ago [-]
Taxing anything that can pollute (methane, gasoline, diesel) would let The Hand sort it out
freeone3000 17 hours ago [-]
Carbon taxes are incredibly unpopular, because it makes easy and convenient things expensive.
triceratops 16 hours ago [-]
They're incredibly unpopular because even when they're made revenue-neutral (meaning, everyone gets a refund check) people don't realize most of them would make money if they actually reduced their carbon.
KennyBlanken 15 hours ago [-]
They're incredibly unpopular because the ultrawealthy use massive amounts of fossil fuels and thus lobby very, very hard against them... and make sure the public is often told just how evil carbon taxes are and how badly they'd hurt Johnny Everyday Worker, even though car ownership, especially in a city (where much of the US populace lives), is not affordable to a large segment of the population.
If memory serves, Jet A is not taxed at all federally in the case of for-profit corporations (while non-commercial users DO pay a tax!), and many states also either do not tax it or tax it very little.
It's completely insane that we do not tax fuel usage for probably the most energy-intensive way to move people and/or goods, and often that movement of people is entirely frivolous.
navane 7 hours ago [-]
Usage taxes are unpopular because usage taxes are regressive.
Joe driving to work spends a larger fraction of his income on fuel and thus fuel tax than his rich counterpart.
This is true for all "let the consumer/polluter pay" taxes; they're all regressive. They say: it's fine to burn up the world as long as you're rich.
WillDaSilva 5 hours ago [-]
Assuming the tax is high enough (or grows over time to become high enough) to offset the negative externalities, and that the money raised is used to offset those externalities, they're better phrased not as "it's fine to burn up the world as long as you're rich", but rather as "it's fine to emit CO2 as long as you sufficiently offset the damage". Accounting for the damage could involve investments into green technologies, or paying ordinary people to make the tax popular, among other things.
Personally I like the idea of setting the price for emitting 1 ton of CO2 equivalent emissions to the realistic cost of capturing 1 ton of CO2. At least, that seems like a reasonable end goal for a carbon tax, since that could fully account for the negative externality. This would of course be obscenely expensive, which would be a strong incentive to lower carbon emissions where possible, and for people to consume less of products that require large emissions to make or use.
The carbon tax would also have to apply to imported goods to be effective, and tracking how much tax should apply to imports would be even more difficult than doing so for domestic sources of pollution.
ericd 3 hours ago [-]
I don’t think they are regressive, if you make it revenue neutral, because carbon footprint is heavily correlated with spending. Everything you buy has a good amount of embodied carbon. Revenue neutral is actually redistributive.
lynx97 5 hours ago [-]
Believe me, the common man doesn't need the ultrawealthy to dislike arbitrary cost increases. Carbon taxes are incredibly unpopular. As is common sense and/or planning for a future 20 years ahead. Humans are, on average, selfish beings. All this climate change activism is NOT the norm, and does NOT resonate with most common people. Talk to some outside of your activist bubble, and you will learn a thing or two about humans.
triceratops 3 hours ago [-]
> dislike arbitrary cost increases
Please see my comment again. Under a revenue-neutral carbon tax everyone gets money back. But they don't realize it. Costs only go up for people who emit more carbon than average.
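The arithmetic behind that is easy to sketch. A toy model (every number here is hypothetical, just to show the mechanism):

```python
# Hypothetical revenue-neutral carbon tax: everyone pays per ton emitted,
# and the total collected is rebated back in equal shares.
TAX_PER_TON = 50  # dollars per ton CO2, assumed

emissions = {"low emitter": 5, "average emitter": 10, "high emitter": 25}  # tons/yr

collected = TAX_PER_TON * sum(emissions.values())
rebate = collected / len(emissions)  # equal share back to everyone

for person, tons in emissions.items():
    net = rebate - TAX_PER_TON * tons  # positive = comes out ahead
    print(f"{person}: net ${net:+.2f}")
```

Only the person emitting more than the group average pays on net; everyone below the mean comes out ahead.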
That's quite condescending btw. Is it "activism" to try to avert a calamity that will increase the cost of living by a lot more 20 years from now? I think it's good fiscal sense. Long-term thinking and planning. Y'know adult shit.
> Humans are, on average, selfish beings
And easily swayed by stupid arguments. Exhibit B: Canada's recent repudiation of the carbon tax because fossil fuel industry propaganda convinced everyone that the tax was the cause of price increases. Now prices will stay the same (because the market will bear them) but no one will get any rebate money.
rcpt 47 minutes ago [-]
Sales taxes are hugely popular and nobody gets a check in the mail from that
triceratops 41 minutes ago [-]
I don't know if you can call any tax "hugely popular". They're at best "grudgingly tolerated".
HPsquared 6 hours ago [-]
Regular people spend a lot more on fuel, as a fraction of their wages, than rich people. This applies especially in poor countries (also to food).
triceratops 3 hours ago [-]
A revenue-neutral carbon tax redistributes the money collected from the tax equally. Poor people get back much more, as a fraction of their wages, than rich people.
AnthonyMouse 14 hours ago [-]
> They're incredibly unpopular because the ultrawealthy use massive amounts of fossil fuels and thus lobby very, very hard against them... and make sure the public is often told just how evil carbon taxes are and how badly they'd hurt Johnny Everyday Worker, even though car ownership, especially in a city (where much of the US populace lives), is not affordable to a large segment of the population.
Eh. It's not Bill Gates and Alice Walton. Sometimes the obvious answer is the real one: It's the fossil fuel industry.
> It's completely insane that we do not tax fuel usage for probably the most energy-intensive way to move people and/or goods and often that movement of people is entirely frivelous.
That one's just the arbitrage problem. Planes move around. If there is an international flight to a country that doesn't tax jet fuel (or taxes it less) then the plane is going to fly into LAX with enough fuel still in the tank to get back to the other jurisdiction and fill up again. Which actually increases fuel consumption because fuel is heavy and they otherwise wouldn't want to do that.
This is the same reason the EU doesn't tax jet fuel.
triceratops 3 hours ago [-]
> the plane is going to fly into LAX with enough fuel still in the tank to get back
Any reason that can't be treated as a fuel import and taxed accordingly? I understand current laws may not allow it but is that legislation impossible to write?
teekert 7 hours ago [-]
And will probably not affect Taylor Swift in the slightest.
triceratops 51 minutes ago [-]
If Taylor Swift's private jet generates a few dollars more in rebates for everyone else, fine.
ZeroGravitas 5 hours ago [-]
The irony is that carbon taxes don't really affect anyone that much.
Even flying would only cost about 10% more for example. And most other activities have carbon free alternatives they can shift to rather than just eat the cost. Which is kind of the point.
ZeroGravitas 5 hours ago [-]
They're not actually unpopular.
They've been implemented all over the world, because they're effective. They cover 31% of emissions in developed nations.
To whatever degree you could say they are unpopular, they're unpopular in regions where the government doing stuff about climate change (or just "the government doing stuff") is unpopular, which makes it odd to single out putting a price on carbon specifically
Are solar panels convenient? All polysilicon today is made with fossil fuels, and the R&D to make it with renewable energy is still in-progress. Not to mention that we ship them across the ocean with fossil fuel.
HPsquared 6 hours ago [-]
The main process for polysilicon manufacturing (Siemens process) uses electricity. That doesn't need fossil fuels.
wat10000 4 hours ago [-]
You mean, they expose the true cost of those things and make the user pay them. They’re already expensive, the cost is just diffused. That’s the whole problem.
FollowingTheDao 57 minutes ago [-]
No, they are unpopular because they are a regressive tax that affects the poor, who are now the majority of the U.S. population.
blkhawk 18 hours ago [-]
I find that message very curious, because the message itself clearly does not cost much, but the machines it is sent on do. So the more messages that are sent, the less each message will cost.
anon7000 8 hours ago [-]
Well, Usenet is 45 years old, and the internet was not nearly as cheap and ubiquitous then
wmf 19 hours ago [-]
Only if that same warning is attached to literally everything else you do. It's unfair to single out AI for using energy.
SoftTalker 19 hours ago [-]
But "asking ChatGPT" is something people do so casually without there being any apparent clues of the costs associated with that. I guess that is true of pretty much everything online though.
Even driving your car around you at least are somewhat aware of the gas you are burning.
wmf 18 hours ago [-]
"Civilization advances by extending the number of important operations which we can perform without thinking of them." — Alfred North Whitehead
Emissions should be fixed on the production side (decarbonization) not on the demand side (guilt/austerity).
mulmen 15 hours ago [-]
While I agree in principle how does this work for fossil fuels? Is the idea that we should make extraction prohibitively expensive?
Scaling up battery production makes EVs more appealing on the demand side. How do you disincentivize fossil fuel production?
wmf 14 hours ago [-]
Carbon tax or something similar.
AnthonyMouse 14 hours ago [-]
Batteries and EVs are the production side. Reducing demand is e.g. requiring you to drive fewer miles. You get a car that doesn't run on petroleum and CO2 goes down while vehicle miles can stay the same or go up.
gbear605 14 hours ago [-]
Making it illegal is always an option, and one that many countries are considering
wat10000 3 hours ago [-]
We should make extraction expensive enough to capture the externalities.
The problem with fossil fuels isn’t that they pollute, but that most of the negative impact of that pollution is borne by others. This results in an artificially low price which distorts the market and results in economically inefficient overuse.
Capture that cost by making producers pay a tax of the corresponding amount, and market forces will use the “right” amount of fossil fuels in ways that are a net benefit.
appreciatorBus 19 hours ago [-]
I would bet most ppl drive around with very little awareness of how much it’s costing, either in money or environmental impact. Many people I’ve met seem to measure efficiency by how much it costs to fill up the tank.
bilbo0s 17 hours ago [-]
Even more fundamentally, think about how much carbon running a cup of water from your faucet produces. No matter where you live, this is more carbon than an LLM prompt generates.
Or, even worse, God forbid, think about how much carbon is produced to create a single bottle or carton of water. Then consider how casually people down bottles of water.
4ndrewl 8 hours ago [-]
The utility of having clean water is arguably higher than that of being able to create a steampunk-style image of a cat in a suit.
blululu 16 hours ago [-]
Can you source that? A lot of places have gravity fed reservoirs that are energy positive/neutral (LA, San Francisco and New York all rely on gravity fed reservoirs that were built before energy intensive pumping was practical). There are some costs but they are pretty small per gallon.
SilasX 13 hours ago [-]
I’m assuming that calculation amortizes the cost of running the water system (including wastewater treatment) and adds the cost of pumping the water to the point that gravity can push it through the pipes. It’s never free.
Piskvorrr 8 hours ago [-]
There seems to be a vast gap between "free" and "more expensive than running LLMs". Also, the water seems...more necessary. Going without LLMs for three days will not threaten your life.
cmcconomy 3 hours ago [-]
we should shut off clean water and shunt the energy toward more text generation
asdff 19 hours ago [-]
That could be solved by charging more for the service. That is the only reason you are aware of the gas burning, after all: you aren't conducting your own air-quality test, you are noticing that you're filling up twice a week at $50 a tank.
phillipcarter 18 hours ago [-]
They're aware of the price they pay for the gas, not the emissions. I would wager that the mass ignorance of the impact of fossil fuels (and rubber on roads) that the broader population has is a significant reason why we're in this climate mess today.
jbm 18 hours ago [-]
> rubber on roads
Funny how this suddenly became a thing after electrification became a thing. Need to find a new way to wag the finger after all.
phillipcarter 17 hours ago [-]
It's always been a thing? I'm pro-electrification, BTW.
jbm 16 hours ago [-]
I am 100% on the side of reducing pollutants — but this was never publicly seen as a major issue and I'm suspicious about the timing.
The oil industry is a conglomerate of degenerates spamming boomer logic all the way down to the workers. Their memes propagate throughout society and lead to the other boomer characteristic of rewriting personal and societal history.
The finger-waggers now are being programmed to pretend they always talked about tire particulates, and the carheads are being programmed to pretend they never cared about 0-60. This is another "We have always been at war with Eastasia", just like they all opposed the Iraq war from day 1 and didn't cancel the Dixie Chicks, et cetera.
This may have been discussed in specialist literature somewhere but even when I did ecology courses in university circa 2001ish, I never heard about tire particulates, while I did hear a lot about greenhouse gasses.
It's a concern but not a civilization ending concern like climate change. I low key resent these attempts to move goalposts to satisfy the writer's urge for negativity.
AnthonyMouse 14 hours ago [-]
It's pretty clearly a talking point.
Consider that a bus has six to ten tires that each weigh around ten times more than a typical car tire. This is presented as the alternative to cars; is it even any different? Not implausible that it could actually be worse, especially if the bus isn't at full occupancy at all times.
Meanwhile the weight difference between EVs and petroleum cars is emphasized in the complaints, even though it isn't very large, while the much larger weight difference between any cars and buses is ignored. Because the point isn't to complain about tires, it's to complain about EVs.
And if the point actually was to complain about tires then you still wouldn't be talking about EVs, you would be talking about tires and how to make them shed less or construct them out of lower toxicity materials etc.
phillipcarter 3 hours ago [-]
The city bus comparison is uneven, but if we consider peak travel times during the week, the density intuitively seems like it works out to less waste. City buses have their numbers and schedule dialed back when you're not in peak hours, and I suspect that it's peak hours where you see the bulk of waste from tires.
My city buses in peak travel hours have anywhere from 20 to 75 people on them. Even if we assume that every one of those folks would have carpooled (which rarely happens), we're looking at a lot of cars, and thus tires, on the road.
hansvm 13 hours ago [-]
Last time I did the math, a Tesla Model Y only had 3x less tire emissions than a semi truck per distance traveled. City buses are on-par with a Tesla Model Y if you only care about mL/km tire wear.
AnthonyMouse 13 hours ago [-]
How is that math supposed to work when a city bus weighs almost ten times as much and has more and bigger tires?
hansvm 12 hours ago [-]
The city bus uses tires with a harder rubber and dimensions such that the pressure at the road is less, plus its normal driving patterns have less wear than typical Tesla use.
To make those sorts of calculations easy, you can ignore all the pressure/usage/etc nonsense and just do basic math on tire dimensions (including min/max tread depth and width, not just radius, though I typically ignore siping and whatnot) and typical longevity. Volume lost per mile driven is basic high-school arithmetic, and the only real questions are regarding data quality and whether the self-imposed constraints (e.g., examining real-world wear rather than wear given optimal driving or something) are reasonable.
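That tread-volume arithmetic can be sketched like so. All dimensions and lifetimes below are made-up placeholders for illustration, not measurements:

```python
import math

def tread_wear_ml_per_km(outer_diameter_cm, tread_depth_cm, width_cm,
                         n_tires, lifetime_km):
    """Wearable tread volume (an annular band from the outer radius down
    to outer radius minus tread depth, across the tire width), spread
    over the tire's lifetime. Returns mL of rubber shed per km."""
    r_out = outer_diameter_cm / 2
    r_in = r_out - tread_depth_cm
    volume_cm3 = math.pi * (r_out**2 - r_in**2) * width_cm  # per tire; 1 cm^3 = 1 mL
    return n_tires * volume_cm3 / lifetime_km

# Placeholder inputs for a passenger car: 66 cm tire, 8 mm of wearable
# tread, 23 cm width, 4 tires, 60,000 km tire life.
print(tread_wear_ml_per_km(66, 0.8, 23, 4, 60_000))
```

Swapping in bus-sized tires, a longer lifetime, and six wheels lets you compare vehicles on the same mL/km basis, which is the approach the comment describes.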
AnthonyMouse 8 hours ago [-]
> The city bus uses tires with a harder rubber and dimensions such that the pressure at the road is less
Harder rubber seems like it could make a difference, but then you could also put tires with harder rubber on a car.
You can get a heavier vehicle to have the same pressure at the road by using more and bigger tires, but then the problem is that the tires are bigger and there are more of them.
> plus its normal driving patterns have less wear than typical Tesla use.
Isn't a city bus constantly starting and stopping, both as a result of city traffic and picking up and dropping off passengers?
> To make those sorts of calculations easy, you can ignore all the pressure/usage/etc nonsense and just do basic math on tire dimensions (including min/max tread depth and width, not just radius, though I typically ignore siping and whatnot) and typical longevity.
I tried plugging these in and it still comes out that a 6-wheel commercial bus has several times the tire wear of a 4-wheel light truck, rather than being the same.
And I expected the difference to be even more, but I guess that goes to show how much the weight argument is motivated reasoning if ~7x the weight is only ~3x the tire wear and then people are complaining about something which is only ~1.2x the weight.
shmeeed 6 hours ago [-]
>I tried plugging these in and it still comes out as a 6-wheel commercial bus has several times the tire wear as a 4-wheel light truck, rather than being the same.
Pardon me if I ask the obvious question, but did you divide your result by the average number of people moved? Because that's the actual utility of mass vs. individual transport. I would find it rather surprising if tire wear was the one measure where buses didn't win out.
hansvm 3 hours ago [-]
A typical city bus has something like 2500 cubic inches of tread that it burns through, compared to 650 for a Model Y. Bus tires typically last 500k miles, vs 50k, generously, for a Model Y. I'd said "comparable," but that was just to avoid argument. From a tire wear perspective, you're better off driving a bus even if you're the only person on it.
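Taking the figures in this comment at face value (the lifetimes are disputed elsewhere in the thread), the per-mile comparison is just two divisions:

```python
# Tread volume burned per mile of driving, using the figures above.
bus_in3_per_mile = 2500 / 500_000     # 0.005 in^3/mile
model_y_in3_per_mile = 650 / 50_000   # 0.013 in^3/mile

print(bus_in3_per_mile, model_y_in3_per_mile)
print(model_y_in3_per_mile / bus_in3_per_mile)  # car sheds ~2.6x more per mile
```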
jsight 28 minutes ago [-]
I knew that there had to be a mistake somewhere.
No, bus tires do not typically last 500k miles. <100k is the norm, and really not more than a long-life car tire.
They do get retreaded more often than car tires do, but that just means they get new rubber added regularly.
AngryData 4 hours ago [-]
Ehh you can't really just put harder tires on a car and leave it at that. Harder tires means less grip, and that is a serious setback and much less safe in a car than the typical bus that runs city routes at lower speeds and less adverse road conditions.
Tire temperature will also play a big role in tire wear, and I wouldn't expect bus tires to get very hot, only rolling half the time and at a lower speed than the typical car.
And of course you also gotta factor in passenger count. Buses generally have more than just 1 or 2 people, while the vast majority of cars will have 1 or 2 people most of the time. And even if a bus tires were to wear out twice as fast as a car's tire, that is still less wear per person than a car.
jsight 27 minutes ago [-]
That's true, but it is all relative. 70k+ mile tires for cars and suvs are fairly common. They sacrifice some ride quality and performance, but not so much as to be unsafe.
floxy 17 hours ago [-]
>It's always been a thing?
Is there a way to quantify this? My experience as well is that the tire particulate pollution has mostly been an anti-EV talking point.
wat10000 3 hours ago [-]
This is normal. Once you solve the biggest problem, something else becomes the new biggest problem.
The biggest problem with tailpipe emissions used to be horrendous smog. That was mostly solved in many places, and now the biggest problem is the impact on the global climate.
The biggest issue with childhood mortality used to be disease. Now we (correctly) focus more on accidental deaths.
EVs solved tailpipe emissions, but they’re not perfect. Their biggest problem is just something else.
01HNNWZ0MV43FF 18 hours ago [-]
Well many of my fellow Americans would only accept an EV if it's gigantic, and even though I can't leave the house without seeing a Prius or a RAV4 hybrid, the news acts like it's gas versus electric as if Toyota hadn't solved this twenty years ago
keybored 18 hours ago [-]
You would wager. Based on what?
There’s been decades of lies about climate change. And once the truth got out society was already massively dependent on it. For cars specifically it was a deliberate policy to make e.g. the US car-dependent. And once the truth got undeniable the cope was switched to people’s “carbon footprint” (British Petroleum). In fact there are rumors that the cope echoes to this day.
Zoom out enough and it becomes obviously unproductive to make “mass ignorance” the locus of attention.
sanktanglia 9 hours ago [-]
People spend hours a day mindlessly scrolling social media apps (streaming video calls to boot) that also consume water and energy per hour, but they are totally disconnected from it.
The real thing here is that these tools are currently in the "subsidy" phase, so the pricing doesn't reflect actual costs. Only once they've been made indispensible and impossible to remove will the prices be jacked up and the product enshittified.
paulcole 12 hours ago [-]
Obviously people have zero awareness of or interest in their true impact on the environment. This extends to every facet of life and is not at all limited to AI use.
Do you really think the average person could get within 2 orders of magnitude when estimating their carbon footprint for a year?
reaperducer 17 hours ago [-]
It's unfair to single out AI for using energy.
Why? AI isn't a human being. We have no obligation to be "fair" to it.
thot_experiment 17 hours ago [-]
Yeah we do, it's basic epistemic hygiene. If you don't freak out about running your shower or microwave for a couple seconds or driving a few hundred feet you shouldn't be concerned about prompting an AI.
const_cast 14 hours ago [-]
Except we do care about those things. We used to get tons of PSAs for carbon footprint. Turn off the lights when you leave a room, turn off your computer overnight, turn off the faucet when you're washing your hands. That type of thing.
paulcole 12 hours ago [-]
Lol If we do care about those things why did the PSAs stop? Problem solved?
FollowingTheDao 53 minutes ago [-]
> If we do care about those things why did the PSAs stop?
Political lobbying.
reaperducer 15 hours ago [-]
Apologizing for AI boiling the oceans sounds like a lot of whataboutism.
I can picture an Elizabeth Holmesian cartoon clutching her diamond necklace.
"Oh, won't somebody think of the tech billionaires?!"
> If you don't freak out about running your shower or microwave for a couple seconds or driving a few hundred feet
The basic premise of the modern tech industry is scale. It's not one person running a microwave for a couple of seconds, it's a few billion people running a microwave for the equivalent of decades.
weberer 2 hours ago [-]
It's just dumb to only care about energy when it comes to this very specific use case, while freely buying cheap plastic bullshit that was literally shipped from the other side of the planet.
GuinansEyebrows 18 hours ago [-]
fairness to one polluter over another isn't the real issue - look at prop 65 in california; or if you're not used to this in CA, think of any time you've been on-call. alert fatigue is real and diminishes the urgency of the underlying message.
keybored 18 hours ago [-]
You don’t need it for any pragmatic benefit because it won’t work. It doesn’t work for eating meat. It won’t work for AI.
The only purpose is to scapegoat the possible environmental or economic fallout. Might as well put it on individuals. Like what’s always done.
I’ve already seen it on the national broadcast. There some supposed experts were wagging their fingers about using AI for “fun”. Making silly images.
Meanwhile we’re gonna put AI to good use in arms races: more spam (automated applications, ads, ads, ads, abuse of services) and anti-spam. There’s gonna be so much economic activity. Disruptive.
reginald78 4 hours ago [-]
AI will probably increase GDP, in the same way shooting the windows out of everyone's houses would increase GDP. Then we can claim it grew the economy.
admiralrohan 4 hours ago [-]
We can't be pessimistic, that hinders flow. Should rather focus on creative ways to increase energy requirements. We will figure this out.
kmeisthax 18 hours ago [-]
Individual LLM requests are vanishingly small in terms of environmental impact; inference providers use a lot of batching to do lots of work at once. Furthermore, LLMs and diffusion models are not the only ML workload. While generative AI tickles investors, most of the ML actually being deployed is more mundane things, like recommendation systems, classifiers, and the like; much of which is used for adtech purposes adversarial to that of users. If LLMs and diffusers were the only thing companies used ML for, but efficiency gains from new hardware remained constant, we'd still be at the 2017 baseline for environmental impact of data centers.
Likewise, I doubt that USENET warning was ever true beyond the first few years of the network's lifetime. Certainly if everything was connected via dial-up, yes, a single message could incur hundreds of dollars of cost when you added up the few seconds of line time it took to send it across the whole world. But that's accounting for a lot of Ma Bell markup. Most connections between sites and ISPs on USENET were done through private lines that ran at far faster speeds than what you could shove down copper phone wiring back then.
troyvit 1 hours ago [-]
> Individual LLM requests are vanishingly small in terms of environmental impact;
The article uses open source models to infer cost, because those are the only models you can measure; the organizations behind the closed ones don't share that info. Here's what the article says:
> The largest of our text-generation cohort, Llama 3.1 405B, [...] needed 3,353 joules, or an estimated 6,706 joules total, for each response. That’s enough to carry a person about 400 feet on an e-bike or run the microwave for eight seconds.
I just looked at the last chat conversation I had with an LLM. I got nine responses, about the equivalent of melting the cheese on my burrito if I'm in a rush (ignoring that I'd be turning the microwave on and off over the course of a few hours, making an awful burrito).
How many burritos is that if you multiply it by the number of people who have a similar chat with an LLM every day?
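The napkin math above can be sketched out. The 6,706 J per response is the article's estimate for Llama 3.1 405B; the 800 W microwave rating and the 100 million users are my own assumptions for illustration:

```python
# Back-of-envelope sketch: per-response energy from the article,
# scaled up. MICROWAVE_WATTS and the daily user count are assumptions,
# not figures from the article.

JOULES_PER_RESPONSE = 6_706      # article's estimate for Llama 3.1 405B
MICROWAVE_WATTS = 800            # assumed typical microwave power draw

def microwave_seconds(joules: float) -> float:
    """Seconds of microwave run time equivalent to `joules`."""
    return joules / MICROWAVE_WATTS

def chat_energy_joules(responses: int) -> float:
    return responses * JOULES_PER_RESPONSE

one_chat = chat_energy_joules(9)  # the nine-response chat above
print(f"one response   ≈ {microwave_seconds(JOULES_PER_RESPONSE):.1f} s of microwave")
print(f"nine responses ≈ {microwave_seconds(one_chat):.0f} s of microwave")

# Scale out: if 100 million people (assumed) had a similar chat each day.
# 1 kWh = 3.6e6 J
daily_kwh = chat_energy_joules(9) * 100_000_000 / 3.6e6
print(f"100M such chats/day ≈ {daily_kwh:,.0f} kWh/day")
```

Even at that assumed scale, nine-response chats work out to under 2 GWh/day, which is small next to total data center consumption but not nothing.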
Now that I'm hungry, I just want to agree that LLMs and other client-facing models aren't the only ML workload and aren't even the most relevant ones. As you say adtech has been using classifiers, vector engines, etc. since (anecdotally) as early as 2007. Investing algorithms are another huge one.
Regarding your USENET point, yeah. I remember in 2000 some famous Linux guy freaking out that members of Linuxcare's sales team had a 5 line signature in their emails instead of the RFC-recommended 3 lines because it was wasting the internet or something. It's hard for me to imagine what things were like back then.
cj 16 hours ago [-]
If what you're saying is true, why are we hearing about AI companies wanting to build nuclear power plants to power new data centers they think they need to build?
Are you saying all of that new capacity is needed to power non-LLM stuff like classifiers, adtech, etc? That seems unlikely.
Had you said that inference costs are tiny compared to the upfront cost of training the base model, I might have believed it. But even that isn't accurate -- there's a big upfront energy cost to train a model, but once it becomes popular like GPT-4, the inference energy cost over time is dramatically higher than the upfront training cost.
You mentioned batch computing as well, but how does that fit into the picture? I don't see how batching would reduce energy use. Does "doing lots of work at once" somehow reduce the total work / total energy expended?
lkbm 15 hours ago [-]
> If what you're saying is true, why are we hearing about AI companies wanting to build nuclear power plants to power new data centers they think they need to build?
Well, partly because they (all but X, IIRC) have commitments to shift to carbon-neutral energy.
But also, from the article:
> ChatGPT is now estimated to be the fifth-most visited website in the world
That's ChatGPT today. They're looking ahead to 100x-ing (or 1,000,000x-ing) the usage as AI replaces more and more existing work.
I can run Llama 3 on my laptop, and we can measure the energy usage of my laptop--it maxes out at around 0.1 toasters. o3 is presumably a bit more energy intensive, but the reason it's using a lot of power is the >100MM daily users, not that a single user uses a lot of energy for a simple chat.
finebalance 3 hours ago [-]
> not that a single user uses a lot of energy for a simple chat.
This seems like a classic tragedy of the commons, no? An individual has a minor impact, but the rational switching to LLM tools by the collective will likely have a massive impact.
protocolture 14 hours ago [-]
>If what you're saying is true, why are we hearing about AI companies wanting to build nuclear power plants to power new data centers they think they need to build?
Something to temper this: lots of these AI datacenter projects are being cancelled or put on hiatus because the demand isn't there.
But if someone wants to build a nuke reactor to power their datacenter, awesome. No downsides? We are concerned about energy consumption only because of its impact on the earth in terms of carbon footprint. If it's nuclear, the problem has already been solved.
fakedang 12 hours ago [-]
> Something to temper this, lots of these AI datacenter projects are being cancelled or put on hiatus because the demand isnt there.
Wait, any sources for that? Because everywhere I go, there seems to be this hype for more AI data centers. Some fresh air would be nice.
AI seems like it is speedrunning all the phases of the hype cycle.
"TD Cowen analysts Michael Elias, Cooper Belanger, and Gregory Williams wrote in the latest research note: “We continue to believe the lease cancellations and deferrals of capacity points to data center oversupply relative to its current demand forecast.”"
FreezerburnV 14 hours ago [-]
Because training costs are sky-high, and handling an individual request still uses a decent amount of energy even if it isn't as horrifying as training. Plus the amount of requests, and content in them, is going up with stuff like vibe coding.
The article says 80-90% of data center usage for AI is for inference, and it comes from a more reputable source than the random blog.
lkbm 11 hours ago [-]
The blog is citing specific studies for its claims. Is there an issue with those studies?
davidcbc 11 hours ago [-]
It's almost a year old at this point so at best it is horribly out of date
adrianN 1 days ago [-]
The important part remains internalizing emission costs into the price of electricity. Fussing over individual users seems like a distraction to me. Rapid decarbonization of electricity is necessary regardless of who uses it. Demand will soar anyway as we electrify transportation, heating, and industry.
seb1204 1 days ago [-]
I agree but reducing consumption or increase of efficiency are still very important aspects of the energy transition. What is not consumed does not need to be generated.
nostrademons 19 hours ago [-]
If you internalize emissions costs into the price of electricity, reduced consumption will happen naturally. Precisely nobody likes higher energy bills, so there's a natural incentive to reduce consumption as long as you're paying for it.
designerarvid 10 hours ago [-]
It would incentivise energy production that avoids those costs.
delusional 18 hours ago [-]
"internalizing emissions" is the kind of thing that's really easy to say, even conceptualize, but really difficult to implement.
You could do it better than we are doing now, but you'll always have people saying: "that's unfair, why are you picking on me"
_aavaa_ 14 hours ago [-]
Even pricing CO2 output from burning fossil gas, plus a % for upstream leaks, and the same for car combustion will go a long way.
Mind you people won't like that since we're so used to using the atmosphere as a free sewer. The idea of having to pay for our pollution isn't palatable since the gasses are mostly invisible.
Though it's sad that we're talking about market solutions rather than outright bans for the majority of applications like we did for leaded gas.
AnthonyMouse 13 hours ago [-]
Outright bans are a non-starter because it requires an infrastructure transition. You couldn't possibly replace every car with an electric one overnight, we can't make them that fast. But if you price carbon then it would cause every new car to be electric, or at least a plug-in hybrid that runs in electric mode 95% of the time. And the people who drive a lot of miles would switch to electric first, which would make a big difference right away.
Meanwhile the people with a 10 year old car they drive 5000 miles a year will keep it until it's a 20 year old car, at which point they'll buy another 10 year old car, but by then that one will run on electricity.
Then you could theoretically ban it, but by then do you even need to?
_aavaa_ 3 hours ago [-]
Nobody is talking about replacing all cars overnight.
You don't have to ban existing cars; they will phase themselves out. Give everyone X years' notice and ban the sale of any non-hybrids for all but a few niche applications. Then in X+Y years ban all combustion engines other than niche applications.
But ultimately, we need to be serious about this, and half the population and the governments of most western countries are not serious. Many people still believe that climate change is a hoax, and ridiculous ideas like hydrogen cars and ammonia burning ships are still getting funding.
asdff 16 hours ago [-]
I wonder how much households can really save here. Most "luxury" items using electricity don't really use much e.g. a modern laptop or modern smartphone. The stuff that does use a lot of electricity are things like your AC unit or your electric heater and electric stove. Seems there is little wiggle room there to me, people might end up just getting saddled with higher bills especially if slightly more efficient home appliances are out of reach (or not purchased by the renter at all). And for people who might get strongly affected out of their budget by these things for lack of income there are usually subsidies to help pay for their energy usage, which might further stymie market forces from changing behavior. Seems most high energy use consumers are high enough income where they won't be much affected by increased power costs like how we see them unaffected by water restrictions and higher fees for high water usage already.
Maybe that says the fees aren't yet high enough for high income people to change behavior, but I'm willing to bet they never truly will be due to the influence this subset of the population holds over politics.
ZeroGravitas 5 hours ago [-]
All the big wins at a household level involve electrification (EV, heat pumps, induction stoves) so involve using more electricity and less fossil fuel.
bee_rider 11 hours ago [-]
Carbon taxes could be phased in over time, to give people a chance to make that decision over the course of natural appliances update lifecycles.
Even if rich people don’t consume much more energy than poor people (I have no idea, just engaging with your idea as stated), they must be buying something with their money… carbon taxes should raise the price of goods with lots of embodied carbon.
If they aren’t consuming much energy and they aren’t buying stuff with much embodied carbon… I dunno, I guess that’s the goal, right?
wmf 12 hours ago [-]
It's not about households anyway, it's about transportation and industrial usage. Larger companies have enough scale that they can afford to invest in efficiency.
_aavaa_ 14 hours ago [-]
Some of these would benefit from changes (e.g. electric heating -> heat pump). Others would be better off with other changes. E.g. too much cooling? Consider better awnings, stronger blinds, or even IR rejecting films.
As for the stove, how much it uses is directly related to the kind of cooking you do, and for how long.
0_____0 1 days ago [-]
Sometimes I see chatter about using solar or nuclear or whatever power for data centers, thereby making them "clean," and it's frustrating that there isn't always the acknowledgement that the clean energy could displace other dirty generation.
Even with things like orphaned natural gas that gets flared otherwise - rescuing the energy is great but we could use it for many things, not just LLMs or bitcoin mining!
AnthonyMouse 13 hours ago [-]
> the clean energy could displace other dirty generation.
If you would have built 10GW of solar or nuclear to replace other generation and instead the data center operators provide funding to build 20GW so that 10GW can go to data centers, the alternative wasn't replacing any of the other dirty generation. And the economies of scale may give the non-carbon alternatives a better cost advantage so you can build even more.
0_____0 49 minutes ago [-]
If this scenario relies on DC operators to give away money, I doubt it could ever seriously represent the bulk of what the industry is doing.
JKCalhoun 2 hours ago [-]
True, but if data centers were exclusively built along places like the Columbia River, as an example, that energy, even excess energy, is practically free once you have the hydro-electric plant. I guess the same is true for solar, wind — any power generated that does not require consumables.
In some markets this might be right but in others it isn't. For instance, if you have CO2 certificates associated with a product then not buying it won't change emissions. It will make the price of certificates cheaper for everyone else and lead to other consumption elsewhere.
adrianN 1 days ago [-]
Yeah, but pricing signals are a good way of reaching those goals.
designerarvid 10 hours ago [-]
Pricing is a good way to steer in this direction. We would have more salad waste if salad was free to bring home from the supermarket I’m sure.
Scarblac 1 days ago [-]
Businesses will only start doing that in significant amounts when carbon emissions are priced according to their environmental impact.
Lerc 1 days ago [-]
I don't think it a given that reducing energy consumption is a required part of the transition.
Increasing demand can lead to stimulus of green energy production.
Uehreka 19 hours ago [-]
There’s no rule that increased demand will necessarily stimulate green energy production, only that it will stimulate energy production. And getting people to care about climate gets tougher, not easier, when energy demand goes up.
hermitShell 14 hours ago [-]
To do that we would need society to agree about what the emission cost is.
Making electricity so abundant and efficient is probably more solvable. You can’t solve stu… society
whimsicalism 2 hours ago [-]
especially when you consider that most of these models are trained/running on contracts for exclusively renewable energy.
designerarvid 10 hours ago [-]
Agreed, that's always been the key question for sustainability. Price is a fantastic mechanism, but negative externalities remain.
somewhereoutth 17 hours ago [-]
Indeed. However the problem with LLMs is that vast amounts of VC money are being thrown at them, in the [misplaced] hope of great returns. This results in a resource mis-allocation of biblical proportions, of which unnecessary carbon emissions are a part.
panstromek 6 hours ago [-]
There's a certain irony here in the fact that this page is maxing out my CPU on idle, doing some unclear work in javascript, while I'm just reading text.
guappa 5 hours ago [-]
I would like browsers to have very strict limits on the amount of cycles a page can use, and have some permission like for camera to use more CPU.
sligor 5 hours ago [-]
I would like also that devs run their site/apps on downgraded machine/VM so that they can early detect performance issues.
turtletontine 1 hours ago [-]
I’ve realized how huge a problem this is by spending time with older relatives recently. Many of them have >5 year old phones (why upgrade?), that current websites and apps often just don’t run on. They SHOULD be able to run - they just don’t.
guappa 4 hours ago [-]
In my company the frontend team has shiny new powerful machines. I on the other hand don't have enough RAM to compile our server.
jve 5 hours ago [-]
What a nice idea.
tessellated 4 hours ago [-]
top joke!
coolcase 4 hours ago [-]
Run the browser in a cgroup?
guappa 4 hours ago [-]
I don't want the whole browser to be stopped or killed. What would that accomplish?
(cgroups, as per a sibling comment, are addressed in this write-up as "not maximally satisfying")
xeox538 7 hours ago [-]
I believe we're currently seeing AI in the "mainframe" era, much like the early days of computing, where a single machine occupied an entire room and consumed massive amounts of power, yet offered less compute than what now fits in a smartphone.
I expect rapid progress in both model efficiency and hardware specialization. Local inference on edge devices, using chips designed specifically for AI workloads, will drastically reduce energy consumption for the majority of tasks. This shift will free up large-scale compute resources to focus on truly complex scientific problems, which seems like a worthwhile goal to me.
whatnow37373 2 hours ago [-]
The CPU development curve is often thrown around but it very seldom fits anything else in reality. It was a very rare and extraordinary set of coincidences that got us here. Computation using silicon turned out to have massive growth potential for a variety of lucky reasons, but say battery tech is not so lucky, nor is fusion, nor is quantum computing.
The low hanging fruit has been plucked by said silicon development process and while remarkable improvement in AI efficiency is likely it is highly unlikely for that to follow a similar curve.
More likely is slow, incremental process taking decades. We cannot just wish away billions of parameters and the need for trillions of operations. It’s not like we have some open path of possible improvement like with silicon. We walked that path already.
Maybe photonics..
righthand 35 minutes ago [-]
I don’t understand the “chips designed for AI workloads” sentiment I hear all the time. LLMs were designed using GPUs. The hardware already exists, so what will make it use less energy in a world where GPUs over the last decade have only become bigger, hotter, more power hungry hardware? If we could develop LLMs on anything less we probably would have shifted back to CPUs already.
panstromek 6 hours ago [-]
It sure seems like that to me. I was pretty impressed by how easily I could run small Gemma on 7 year old laptop and get a decent chat experience.
I can imagine that doing some clever offloading to a normal programs and using the LLM as a sort of "fuzzy glue" for the rest could improve the efficiency on many common tasks.
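The "fuzzy glue" idea above can be sketched as a toy router: queries that ordinary code can answer (here, plain arithmetic) go to a cheap local function, and only the rest fall through to the model. `call_llm` is a hypothetical stand-in, not a real API:

```python
# Toy sketch of routing cheap queries away from an LLM.
import ast
import operator

OPS = {ast.Add: operator.add, ast.Sub: operator.sub,
       ast.Mult: operator.mul, ast.Div: operator.truediv}

def _eval(node):
    if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
        return node.value
    if isinstance(node, ast.BinOp) and type(node.op) in OPS:
        return OPS[type(node.op)](_eval(node.left), _eval(node.right))
    raise ValueError("not plain arithmetic")

def try_arithmetic(query: str):
    """Safely evaluate a pure-arithmetic query; None if it isn't one."""
    try:
        return _eval(ast.parse(query, mode="eval").body)
    except (SyntaxError, ValueError, ZeroDivisionError):
        return None

def call_llm(query: str) -> str:
    return "(llm) ..."              # hypothetical expensive path

def route(query: str) -> str:
    result = try_arithmetic(query)
    if result is not None:
        return f"(local) {result}"  # no inference energy spent
    return call_llm(query)

print(route("2 * (3 + 4)"))
```

In practice a router like this would cover far more than arithmetic (dates, unit conversions, lookups), but the shape is the same: spend inference only where deterministic code can't answer.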
coolcase 4 hours ago [-]
Big tech ain't investing heavily so you can run local, what data does that leave them to sell, and what power and control does that give them. Zilch.
neves 13 hours ago [-]
Best article I've ever read about the energy needs of AI.
Impressive how Big Tech refuses to share data with society for collective decisions.
I'd also recommend the Data Vampires podcast series:
The number of people in this comment thread defending this gargantuan energy footprint for a technology that currently in large measure is being used for a tremendous amount of dogshit things (and oceans of visual/text spam) is amusing considering the hysterics that this same energy use problem caused when it came to crypto.
I guess it becomes okay when the companies guzzling the energy are some of the biggest tech employers in the world, buttering your bread in some way.
vanschelven 4 hours ago [-]
At least in the case of AI the _theoretical_ benefits are there, whereas crypto is wasteful _by design_.
Whether the cost/benefit works out in the case of AI is another question.
phillipcarter 3 hours ago [-]
Yeah, it's a different compute/cost curve worth considering.
On one hand, the cost of compute per token has gone down a lot, and will continue to go down, because that's exactly the economic incentives at play. We had a little short-term nonsense where "the bigger the better" was all the rage, but inference was never this way, and now training is also pushing in this direction.
But on the other hand, less compute per token means it can be more broadly deployed. And so there is likely more energy use, not less, in the long run.
DavidSJ 2 hours ago [-]
I agree with you that AI has much larger theoretical benefits than cryptocurrency, but I don’t think it’s fair to say cryptocurrency is wasteful by design. Bitcoin’s proof of work serves a vital function: securing the network from double-spending attacks. At the time of bitcoin’s invention, it was the only known solution to that problem, so it’s no more “wasteful by design” than a bank hiring a security guard. There are admittedly alternatives to proof of work today; I’m unsure how well they work by comparison, but even if they suffice for security, that only means that bitcoin is wasteful due to being more primitive technology, not by design.
I remember when the tenor on HN comments toward crypto was very positive. It was not always negative.
skybrian 1 hours ago [-]
Unlike Bitcoin, which burns more energy for the same usage as the price goes up, LLM inference prices are dropping rapidly [1] because AI companies have big incentives to cut costs.
Companies like Apple and Google are both building data centers and trying to make on-device AI a thing. Unfortunately, they also keep inventing new, more expensive algorithms.
It’s at least plausible that most LLM use will become cheap enough to run on battery-limited devices like laptops and phones, though it’s not what most people are betting on.
Good that you connected it with the crypto boom. But the problem is that it's hard to rewind these kinds of trends once started. Human nature.
pjc50 3 hours ago [-]
NFTs are dying and link rotting. ICOs are out of fashion.
VR is back in its niche.
3DTV .. I've not seen that marketed for quite a while now. Things that are fads will die out eventually.
Crypto meanwhile is in an odd space: everyone knows blockchains are too much of a hassle and that the volatility is too high, so they're using centralized exchanges and "stablecoins" (shadow dollars). There's still a huge amount of money there but not quite as much as the stadium ads / FTX peak.
alnwlsn 2 hours ago [-]
I wish you could still buy 3D TVs. Why? We used them to build a truck simulator which had a driver and passenger. The view out the window is different for each person. They were perfect for that, but it's too small a niche to keep making new TVs.
ezfe 3 hours ago [-]
Crypto energy usage (bitcoin, proof of work) is literally a waste lol
DonaldFisk 1 hours ago [-]
Bitcoin, yes. But Ethereum uses proof of stake, which has around 0.1% of proof of work's energy requirements. If only there was a way of reducing AI's energy requirements by a similar amount.
janaagaard 5 hours ago [-]
- "4.4% of all the energy in the US now goes toward data centers"
- "by 2028 [...] AI alone could consume as much electricity annually as 22% of all US households."
What would the 22% be if compared against all US energy instead of just all US household?
EdiX 2 hours ago [-]
US households constitute 21% of all energy use [1], so 22% of that is 0.21 × 0.22, which is about 4.6%.
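The conversion is a one-line multiplication, sketched here with the commenter's cited figure and the article's projection:

```python
# Households are ~21% of US energy use (commenter's cited figure);
# the article projects AI at 22% of household consumption by 2028.
household_share = 0.21   # US households as fraction of total US energy
ai_vs_households = 0.22  # article's projection relative to households
ai_share_of_total = household_share * ai_vs_households
print(f"{ai_share_of_total:.1%}")  # roughly 4-5% of all US energy
```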
> In 2017, AI began to change everything. Data centers started getting built with energy-intensive hardware designed for AI, which led them to double their electricity consumption by 2023.
As we all know, the generative AI boom only really kicked into high gear in November 2022 with ChatGPT. That's five years of "AI" growth between 2017 and 2022 which presumably was mostly not generative AI.
mattnewton 10 hours ago [-]
2017 is the year after AlphaGo beats Lee Sedol and is when the attention is all you need paper was published. The writing was on the wall. OpenAI just found product market fit in november 2022, but the industry wasn't wandering aimlessly until then.
SoftTalker 15 hours ago [-]
People started using GPUs for ML at around that time.
NitpickLawyer 8 hours ago [-]
Meta was in the process of AI-ing everything on their stacks, with search, similarity, graphs, recommendation, etc. everywhere on their properties. Incidentally that's why they already had tens/hundreds of thousands of GPUs when the LLM craze hit, and why they were in a good place to work on llama and other stuff.
thedevilslawyer 8 hours ago [-]
What's the energy imprint of the human doing the same work that AI is now going to do? If AI's imprint is less, what is the right thing for us to do?
mxfh 51 minutes ago [-]
My go-to example is when some EU initiatives proposed labeling mobile phones by energy use. It completely missed the forest for the trees, as a prime example of overoptimization if your goal is carbon emissions reduction.
Nearly any other daily activity of a consumer in the developed world uses orders of magnitude more energy and resources than scrolling TikTok on a phone.
Examples?
– Driving to work: commuting burns far more fuel in a week than your phone uses in a year.
– Gym sessions: heated, lit, air-conditioned spaces plus transit add up quickly.
– Gaming or watching TV: bigger screens, bigger compute, easily 100x higher power needs vs phone gaming.
– Casually cooking at home: using a metric ton of appliances (oven, stove, fridge, pans) powered like twice a week, replaced every ~10 years.
– Reading print media: a daily newspaper or weekly book involves pulp, ink, shipping, and disposal.
– Streaming on a laptop or smart TV: even this draws more power than your phone.
– Taking a shower: the hot water energy use alone dwarfs your daily phone charge.
Of course not doing any sports or culture is also not what societies want, but energy-wise a sedentary, passive TikTok lifestyle is as eco-friendly as it gets vs. any other real-world example.
Phones are basically the least resource-intensive tool we use regularly.
Externalities, context, and limited human time effects matter a lot more than what one phone uses vs the other.
Even e-readers already break even with books after 36 paper equivalents
I've been thinking about this. If the human-equivalent of training an LLM is sending hundreds/thousands of students through college for many years, I can't help but think that the energy needed for both outcomes is comparable.
ahtihn 7 hours ago [-]
Kill the human?
If you don't want to go there, it doesn't really matter how much energy the human uses because the human will just use the same energy to do something else.
zavec 7 hours ago [-]
Not necessarily. I think the point of comparison here is how much energy does AI use to e.g. generate a video, compared to the energy used not by the human themselves, but by running XYZ software on a computer with a beefy graphics card for however many hours it'd take a human to do the same work.
shmeeed 4 hours ago [-]
While that's a valid point of view, as long as the human is the bottleneck, it's not going to scale to infinity and beyond.
Human's got to exist and needs to work to eat. They don't really, necessarily, existentially need to be 10x productive with the help of AI.
But I'll be honest, that's not really a solid argument, because it could rapidly lead to the question of why they do this exact job in the first place, instead of e.g. farming or whatever else there might be that can be called a net positive for humanity without reservations.
thedevilslawyer 5 hours ago [-]
Indeed. Everything taken into account - work getting done takes energy. And if an agent can do a task for less energy than a human, then indeed it's a benefit. This would be the apt comparison, instead of looking at some overall datacenter energy consumption.
HPsquared 2 hours ago [-]
Or a team going to a location and filming, etc.
coolcase 4 hours ago [-]
The human is alive anyway so zero extra footprint.
mentalgear 1 days ago [-]
When companies make ESG claims, sensible measurement and open traceability should always be the first proof they must provide. Without these, and validation from a credible independent entity such as a non-profit or government agency, all ESG claims from companies are merely PR puff pieces to keep the public at bay (especially in "AI").
stevage 1 days ago [-]
esg?
JohnFen 1 days ago [-]
Environmental/Social/Governance. From Wikipedia:
Environmental, social, and governance (ESG) is shorthand for an investing principle that prioritizes environmental issues, social issues, and corporate governance.
kkarakk 19 hours ago [-]
when has ESG not been FUD or a way to bypass sanctions from poorly thought out climate change targets?
dr_dshiv 6 hours ago [-]
Solving climate change will take a lot of energy.
I found this article to be a little too one-sided. For instance, it didn’t talk about the 10x reductions in power achieved this past year — essentially how GPT-4 can now run on a laptop.
Viz, via sama “The cost to use a given level of AI falls about 10x every 12 months, and lower prices lead to much more use. You can see this in the token cost from GPT-4 in early 2023 to GPT-4o in mid-2024, where the price per token dropped about 150x in that time period. Moore’s law changed the world at 2x every 18 months; this is unbelievably stronger.”
https://blog.samaltman.com/three-observations
Fraterkes 5 hours ago [-]
The current AI boom has been around for ~3 years. Implying that efficiency gains in AI will be like Moore's law (i.e. they will accrue at around the same high rate for decades) based on data from that time frame is pretty irresponsible in my opinion. Also, Altman is not in any way an expert in any of this and has an obvious financial interest in making this technology palatable. There's no good reason to cite his writing in these kinds of discussions.
dr_dshiv 4 hours ago [-]
Distillation seems like a pretty robust approach. There is no good reason to think that GPT5 or 6 won’t have massive rapid efficiency gains.
He’s one of the most knowledgeable persons in the world on the topic. It’s like saying that the ceo of BMW isn’t citable on conversations about expected cost decreases in electric cars.
Any environmental topic is susceptible to a huge amount of groupthink. The world isn’t so binary as people make it out to be. It is far from truth that LLMs=bad for environment, any more than computers=bad for the environment.
Fraterkes 6 minutes ago [-]
Alright, so can you actually name some numbers so that the stuff you are claiming is at least falsifiable?
Altman mentions a 150x increase in efficiency, and you claim that trend will continue through to GPT-6.
At that point these models would be 22,500x as efficient as they currently are, which would mean generating a 10 hour long video would cost around the same amount of electricity as running your microwave for 15 minutes. Will you have some introspection if that doesn't come to pass?
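The extrapolation being challenged compounds Altman's cited GPT-4 to GPT-4o drop over two more generations:

```python
# If each model generation brought the same ~150x cost/efficiency drop
# Altman cites for GPT-4 -> GPT-4o, two further generations (GPT-5 and
# GPT-6 in the comment's framing) would compound multiplicatively.
per_generation = 150
generations = 2
compound = per_generation ** generations
print(compound)  # 22500, the figure in the comment above
```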
wyre 2 hours ago [-]
I think its more like saying that the ceo of Tesla isn't citable on conversations about the expected advancement in electric car technology.
What I mean is that I have a healthy level of skepticism with Altman. He has to constantly battle for funding. Surely, he must be knowledgeable about LLMs, but he's the CEO of the largest AI company in the world, his PR needs to give "most knowledgeable person in the world on the topic" but I think that title should go to all of the engineers and developers working on these technologies and not a capital founder.
All that said, I agree that LLM's being bad for the environment is a complex topic. I think it would be more accepted if people had safety nets and could be excited for AI to take their job instead of having to be terrified, or if AI isn't just used as another tool for increasing wealth inequality.
dr_dshiv 1 hours ago [-]
One reason to take these expected cost reductions more seriously is because it DOESN'T fit the desired narrative for OpenAI. Ideally, from a front-runner competition perspective, it would NOT be possible to create the same level of quality at < 1/10 the cost/energy within a year. But, distillation and other approaches seems to make this possible. There is a LOT of room for optimization with LLMs.
Juliate 6 hours ago [-]
It's still _additional_ power usage that
1/ did not exist before
2/ does not replace/reduce previous/other power (some, very much more critical and essential) usages.
3/ a LOT of tasks are still way more energy/time-efficiently done with regular existing methods (dedicated software, or even by hand), but are still asked/improperly routed to AI chatbots that ... statistically guess the answer.
dr_dshiv 5 hours ago [-]
It is also additional value creation.
It also leads to automation and efficiency, even if it isn’t a fully linear path.
AI isn't a waste. We can't let environmental consciousness get in the way of rather natural human development. Especially CO2. (I have different opinions about biodiversity because of the irreversible loss. I also believe that we have the technology to pause and reverse climate change — but don't pursue it because of degrowth ideologies.)
cdblades 1 hours ago [-]
> It is also additional value creation.
Give some concrete examples and stats?
> It also leads to automation and efficiency, even if it isn’t a fully linear path.
Ditto.
3 hours ago [-]
Juliate 3 hours ago [-]
> It is also additional value creation.
For _whom_, really?
Shareholders of AI tools producing companies?
Shareholders of companies that pretend to replace people and verifiable working processes with poorly understood black boxes?
(I can't help but notice the _same_ playbook as with crypto, NFTs, Web3, metaverse, and the same enabling-hardware provider).
Value, automation, efficiency will not solve the climate change challenges, if they are not directed towards it aggressively, as well as humanity acceptance and well-being.
Alas, they are directed towards a very few bank accounts. Violence and subjugation, in many of their forms, are directed towards the others. It's not by accident.
t_tsonev 5 hours ago [-]
No amount of value, in the economic sense, will solve climate change.
SirHumphrey 3 hours ago [-]
Well, somebody must buy solar panels, batteries, (nuclear reactors) and finance the whole research process.
While “economic value” =/= “solving climate change”, without enough tax revenue costly transitions are impossible.
Juliate 3 hours ago [-]
Very capitalistic reasoning.
It's revealing that for some, it's easier to imagine Earth without life than Earth without capitalism.
teekert 19 hours ago [-]
I wonder what the carbon footprint of all those ads is.
amelius 19 hours ago [-]
Not just the ads, but also the overconsumption which they cause.
__MatrixMan__ 14 hours ago [-]
Not just overconsumption, but also waste due to supply chain fragility. If you can induce demand anywhere then supply has to do crazy things to keep up.
mg 1 days ago [-]
The brain uses 20% of the human body's energy.
I wouldn't be surprised if mankind evolves similarly to an organism and ends up using 20% of all the energy it produces on AI. That's about 10x what we use for software at the moment.
But then more AI also means more physical activity. When robots drive cars, we will have more cars driving around. When robots build houses, we will have more houses being built, etc. So energy usage will probably go up exponentially.
At the moment, the sun sends more energy to earth in an hour than humans use in a year. So the sun alone will be able to power this for the foreseeable future.
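The hour-vs-year claim checks out at the order-of-magnitude level. A rough sketch, using ballpark figures that are my own assumptions rather than from the comment (Earth intercepts roughly 173,000 TW of solar power; world primary energy use is around 600 EJ per year):

```python
# Ballpark figures (assumptions, not from the comment above):
solar_tw = 173_000             # solar power intercepted by Earth, in TW
human_ej_per_year = 600        # world primary energy use, in EJ/year

sun_one_hour_twh = solar_tw * 1.0                        # TWh delivered in one hour
human_one_year_twh = human_ej_per_year * 1e18 / 3.6e15   # EJ -> TWh

print(f"sun, one hour:      {sun_one_hour_twh:,.0f} TWh")
print(f"humanity, one year: {human_one_year_twh:,.0f} TWh")
```

Both land around 170,000 TWh, so "an hour of sun vs a year of humanity" is roughly right, at least before accounting for what fraction of that sunlight is practically capturable.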
HPsquared 2 hours ago [-]
First-order effect is that quality of life will improve though as a result of all that work being done. People able to live more comfortably, relax more etc.
The main complaint about energy usage is it will damage the environment, which will (indirectly) reduce quality of life.
It's a question of which factor wins.
Scarblac 1 days ago [-]
But the article says that energy use by AI is 48% more carbon intensive than the US average. So talk of solar power is a red herring -- that's not what it is running on now.
mg 1 days ago [-]
I am thinking about the future here.
I don't think there will be much carbon intensive energy creation in a few decades from now. It does not make sense economically.
Scarblac 1 days ago [-]
You said "for the foreseeable future", which I interpret as being about now.
Anyway I hope you're right, but so far global CO2 output is still growing. All the other energy has only come on top of carbon intensive energy, it hasn't replaced any of it. Every time we build more, we find new ways of spending that much energy and more.
mg 1 days ago [-]
Seeing 20 years into the future is quite possible in some aspects.
I remember how my friends and I discovered email in 1999 and thought "Yay, in the future we'll all do this instead of sending letters!". And it took about 20 years until letters were largely replaced by email and the web. And when the first videos appeared on the web, it was quite clear to us that they would replace DVDs.
Similar with the advent of self driving cars and solar energy I think.
newtonsmethod 15 hours ago [-]
The energy use by AI probably is just as, if not more, carbon intensive, but the article never says that. It talks about the energy use of the general data center.
> The carbon intensity of electricity used by data centers was 48% higher than the US average.
AnthonyMouse 13 hours ago [-]
In case anyone is wondering why that is, it's because they put data centers in the places with the cheapest electricity. Which, in the US, is in places like Virginia and Ohio, where they burn fossil fuels.
If the people always talking about how cheap solar is want to fix this, find a way to make that cheapness actually make it into the customer's electric bill.
ipdashc 10 hours ago [-]
I've always wondered why data centers aren't taking off more in places like Iceland (cheap geothermal) or Quebec (cheap hydro). Both of these places are also pretty cold and one would think this benefits cooling.
There are periodically news articles and such about data centers in Iceland, of course, but I get the impression it's mostly a fad, and the real build-outs are still in Northern Virginia as they've always been.
The typical answer I've seen is that Internet access and low latency matter more than cooling and power, but LLMs seem like they wouldn't care about that. I mean, you're literally interacting with them over text, and there's already plenty of latency - a few extra ms shouldn't matter?
I'd assume construction costs and costs of shipping in equipment also play a role, but Iceland and Canada aren't that far away.
fallingknife 9 hours ago [-]
How much bandwidth is there in Iceland? I suspect not much because the population is only 400K. You will need to lay new undersea fiber. And how are you going to build them? The construction alone would take a massive amount of resources and manpower not feasibly available there. And what about the power supply? In data center heavy areas like Virginia, data center power consumption is already 25% of the entire state power consumption, and VA has 22x more people than Iceland. So if you build even 1/5th the number of data centers in just Virginia, that will consume the entire power grid of Iceland. Therefore, in addition to the data centers themselves, you are also going to have to build an entirely new grid and distribution system.
This assumes no technological adaptations towards efficiency. Consider the energy expenditure of walking a mile: it isn't insignificant. Now imagine you have a bicycle. Some cyclists will train and do century rides, a distance never possible in a day by merely walking. But these are few riders overall; most will not maximize capability to that extent, yet will still take advantage of the efficiency of the bike.
vmg12 1 days ago [-]
> When robots drive cars, we will have more cars driving around
This doesn't seem true. In SF, waymo with 300 cars does more rides than lyft with 45k drivers. If self driving cars interleave different tasks based on their routes I imagine they would be much more efficient per mile.
floxy 17 hours ago [-]
>This doesn't seem true.
Seems like we are way too early in the adoption curve to tell. Currently the average number of passengers per trip is >1.0 across the whole fleet. Some day, I'd expect that to dip below 1.0, as people send an empty car to pick up the dog from the vet, or circle the block to avoid having to pay for parking, etc.
asdff 18 hours ago [-]
If Waymo is doing more rides with 300 cars than 45k drivers on Lyft, we can assume that Waymo cars are on the road serving customers at least 150x as much of the time as a Lyft driver. So yes, it could really mean more cars are "around" even if the fleet is much smaller.
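The 150x figure is just the fleet-size ratio implied by the parent comment's numbers:

```python
# If 300 Waymos complete at least as many rides as 45,000 Lyft
# drivers, each car must average at least this many times the
# rides per vehicle:
lyft_drivers = 45_000
waymo_cars = 300

rides_ratio_per_vehicle = lyft_drivers / waymo_cars
print(rides_ratio_per_vehicle)  # 150.0
```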
0_____0 1 days ago [-]
Is it really only 300 cars? They feel like they're everywhere!
> With more than 700 vehicles in its fleet - 300 of which operate in San Francisco - Waymo is the only U.S. firm that runs uncrewed robotaxis that collect fares.
Those numbers are from April 2025.
floxy 17 hours ago [-]
>We’ve also incrementally grown our commercial fleet as we’ve welcomed more riders, with over 1,500 vehicles across San Francisco, Los Angeles, Phoenix, and Austin.
Thank you for this data point. It massively lowers the embodied carbon footprint (carbon from manufacturing, supply chain, transportation, etc.). Operational carbon is a solved problem; it is easy to measure and can be supplied from renewable sources.
mg 1 days ago [-]
Existing rides will be done more efficiently, but since rides are so much cheaper without a driver, many more rides will be taken.
A car driving from A to B will cost less than 50% of the current price. Which will unlock a huge amount of new rides.
fallingknife 9 hours ago [-]
That would mean that 5.5% of the SF population are Lyft drivers.
amelius 1 days ago [-]
One problem: all this energy is eventually turned into heat ...
mg 1 days ago [-]
Most of the sunlight that hits a roof is already turned into heat. Whether you use that for calculations or not does not make a difference.
Not sure about the exact numbers, but I guess that at the moment normal roofs and solar panels absorb very roughly about the same percentage of sunlight.
So if in the future solar panels become more efficient, then yes, the amount of sunlight turned into heat could double.
Maybe that can be offset by covering other parts of earth with reflective materials or finding a way to send the heat back into the universe more effectively.
amelius 1 days ago [-]
What if you put a solar farm in a desert, though?
And also, people should paint their roofs white.
carunenjoyerlp 8 hours ago [-]
>And also, people should paint their roofs white.
Some could, but a dark roof can be beneficial in the winter
briandear 1 days ago [-]
Why not nuclear?
natmaka 10 hours ago [-]
Because a mix of renewables deployed on a continent (most grids nowadays tend to extend over whole continents, because it enhances continuity of service while offering many ways to optimize for emissions, costs...) is better: cheaper, less dependency on any fuel, less risk related to accidents/neglect/mistakes/war/terrorism/hot waste...
Building and running a nuclear reactor involves a lot of physical activity. And if the past is an indicator, we always move from physical activity to the flow of electrons.
The discussion about nuclear vs solar reminds me of the discussions about spinning HDs versus solid state drives when SSDs were new.
carunenjoyerlp 1 days ago [-]
HDDs form the backbone of all large storage systems; they serve many purposes today. Magnetic tape is still in use too.
palmfacehn 44 minutes ago [-]
From "The Limits to Growth" to peak oil and beyond: as a rule, scarcity doom and neo-Malthusianism haven't played out the way proponents claim. I've been very critical of the AI hype cycle here and elsewhere, but this isn't it. In the long run technological advancements increase our productivity and quality of life.
Yes, much of what is being promoted is slop. Yes, this bubble is driven by an overly financialized economy. That doesn't preclude the possibility of AI models precipitating meaningful advancements in the human condition.
From refrigeration to transportation, cheap and abundant energy has been one of the major driving forces in human advancement. Paradoxically, consuming cheap energy doesn't reduce the amount of energy available on the market. Instead it increases the size of the market.
Ericson2314 19 hours ago [-]
We really need to improve the power grid. I don't think about "A. I." very much, but I am glad that something is making us upgrade the grid.
lucb1e 16 hours ago [-]
> I am glad that something is making us upgrade the grid
A few big consumers in centralized locations isn't changing the grid as much as the energy transition from fuels to electricity is
Ericson2314 16 hours ago [-]
The transition to electric vehicles in the US has been disappointing, to say the least.
lucb1e 16 hours ago [-]
I thought everyone over there has air conditioning running on electricity?
Ericson2314 13 hours ago [-]
Yes, but we already do. We need new electricity demand.
4 hours ago [-]
floxy 17 hours ago [-]
40% of electricity consumption in Virgina will be data centers in 2030?
> When you ask an AI model to write you a joke or generate a video of a puppy, that query comes with a small but measurable energy toll and an associated amount of emissions spewed into the atmosphere. Given that each individual request often uses less energy than running a kitchen appliance for a few moments, it may seem insignificant.
> But as more of us turn to AI tools, these impacts start to add up. And increasingly, you don’t need to go looking to use AI: It’s being integrated into every corner of our digital lives.
Forward looking, I imagine this will be the biggest factor in increasing energy demands for AI: companies shoving it into products that nobody wants or needs.
Uehreka 19 hours ago [-]
In the short term perhaps, but even without carbon pricing the raw electricity prices will probably tamp down the enthusiasm. At some point it’ll become cool for activist investors to demand to see ROI for AI features on earnings calls, and then the fat will get trimmed just like any other trend that goes too far.
I think the bigger underrated concern is if LLMs fall into an unfortunate bucket where they are in fact generally useful, but not in ways that help us decarbonize our energy supply (or that do, but not enough to offset their own energy usage).
xarope 11 hours ago [-]
I have zero use for the AI summary that Google chooses to prepend to my search results. I decided to fact check a few, and they were totally wrong in subtle ways, either changing a word or phrase and thus inverting the meaning, or else summarizing entirely from a site which had no relevance to my search.
So yes, I'd like to disable this completely. Even if it's just a single birthday candle worth of energy usage.
Nition 11 hours ago [-]
> increasing energy demands for AI: companies shoving it into products that nobody wants or needs
I think of this a little every time Google gives me another result with the AI summary and no option for anyone to turn it off. Apparently worldwide there are 8+ billion searches every day.
keybored 17 hours ago [-]
Try to buy something that isn’t wrapped in three layers of plastic. Or that isn’t made of plastic itself. Then go to the checkout and see their “PSA” about how asking for a plastic bag to carry your plastic merchandise kills the planet.
I’m sorry. I’m being blocked by some mysterious force from understanding what “actual human” means. And I don’t know how to get you in contact with your car manufacturer. Would you like me to repeat my 20 step suggestion on how to troubleshoot “why does my shitty car put the A/C on freezer mode whenever “Indian Summer” tops the charts in Bulgaria”, but with more festive emojis?
carunenjoyerlp 1 days ago [-]
>you might think it’s like measuring a car’s fuel economy or a dishwasher’s energy rating: a knowable value with a shared methodology for calculating it. You’d be wrong.
But everyone knows fuel economy is anything but a knowable value. Everything matters, from whether it has rained in the past four hours, to temperature, to the loading of the vehicle, to the chemical composition of the fuel (HVO vs traditional). How worn are your tires? Are they installed the right way? Are your brakes dragging? The possibilities are endless. You could end up with twice the consumption.
By the way, copy-pasting from the website is terrible on desktop firefox, the site just lags every second, for a second.
GuinansEyebrows 18 hours ago [-]
fuel economy, like blood glucose levels, is impacted by many factors, but you can measure it over time. you might not be able to prescribe a course of action but you can make corrections to the course you're already on.
jeffbee 16 hours ago [-]
This series of articles is driving me insane. The authors or editors are using inappropriate units to shock readers: billions of gallons, millions of square feet. But they are not putting the figures into context that the reader can directly comprehend. Because if they said the Nevada data centers would use 2% as much water as the hay/alfalfa industry in Nevada then the entire article loses its shock value.
panstromek 5 hours ago [-]
I agree. Even the first few paragraphs strike me as intentionally misleading. The whole "AI energy saga" seems to be full of manipulative claims. I can't help but feel that the intent is just to paint AI as bad in whatever way possible. It feels like the research is derived from predetermined conclusions.
teleforce 13 hours ago [-]
> unprecedented and comprehensive look at how much energy the AI industry uses
Not sure about comprehensive claim here if end-to-end query chains were not considered.
For example, the contribution of mobile wireless nodes (which the majority of users are on) to energy consumption is totally ignored. The wireless power amplifier (PA), on both the user and base-station sides, is notorious for its inefficiency: less than 50% in practice, although in theory it can be around 80%. Almost all current AI applications are cloud-based, not local-first, so the end users' energy consumption and contribution need to be included.
johnny_________ 12 hours ago [-]
I ponder this a lot, but the interface of MIT Technology Review is unbearably overdesigned: it's got that annoying narrow smartphone format where you can't zoom out, and then all these fancy graphics. Can't we have crisp, easy-to-read HTML? The format annoyed me so much I didn't read the article, because this kind of design makes me doubt the source. Alas.
myaccountonhn 6 hours ago [-]
The article works fine with the browser's reader-mode for me.
giantg2 1 days ago [-]
With all the issues and inefficiencies listed, there is a lot of room for improvement. I'm hopeful that just as data center energy use didn't rise from 2005-2017 (a stat they cite), so too will AI energy needs flatten in a few years. GPUs are not very efficient. Switching to more task-specific hardware will eventually provide more efficiency. This is already happening a little with stuff like TPUs.
phillipcarter 18 hours ago [-]
Well, this was disappointing:
> There is a significant caveat to this math. These numbers cannot serve as a proxy for how much energy is required to power something like ChatGPT 4o.
Otherwise this is an excellent article critiquing the very real problem that is opacity of these companies regarding model sizes and deployments. Not having an honest accounting of computing deployed worldwide is a problem, and while it's true that we didn't really do this in the past (early versions of Google searches were undoubtedly inefficient!), it's not an excuse today.
I also wish this article talked about the compute trends. That is, compute per token is going significantly down, but that also means use of that compute can spread more. Where does that lead us?
protocolture 15 hours ago [-]
Weird I was assured that Bitcoin would be using all of the worlds electricity by now.
Which I already thought was odd, because London would need all that electricity to see over the giant mountain of poop piled up by all the horses the British use for transportation.
blkhawk 18 hours ago [-]
The numbers in the article are all over the place. The article seems to try, and some of the more general calculations should work out on paper, but the image-gen ones especially I can sorta disprove with my own experience in local gen.
Even where it sorta matches (the 400-feet e-bike thing), that only works out for me because I use an AMD card. An NVIDIA card can have several times the generation speed at the same power draw, so it all falls down again.
And the parameters they tried to standardize their figures with (the 1024x1024 thing) are also a bit meh, because the SAME number of pixels in a different aspect ratio can have huge variations in gen speed and thus power usage. For instance, for most Illustrious-type checkpoints the speed is about 60% higher at aspect ratios other than 1024x1024. It's all a bit of a mess.
mNovak 14 hours ago [-]
This gives me the silly idea to go try to measure the power consumption of the local data center by measuring the magnetic field coming off the utility lines.
emushack 1 days ago [-]
I would like to see more data centers make use of large-scale Oil Immersion-Cooling. I feel like the fresh water use for cooling is a huge issue.
Isn't water just the transfer medium between the server and the heat exchangers outside? How would changing that to oil help?
chneu 22 hours ago [-]
It wouldn't really help.
Oil might be able to carry more heat but it's more expensive to use.
Oil immersion is something nerds like to think is amazing but it's just a pain in the ass for negligible benefits. Imagine the annoyance of doing maintenance.
asdff 18 hours ago [-]
Wouldn't it be no different except your hands get a little oily? Say you take out a RAM stick: oil goes into the empty DIMM slot, but so what, because it's displaced again when you put in the new RAM stick.
polski-g 2 hours ago [-]
Seems like a solved problem. We have unlimited sources of clean power in nuclear and solar. China alone now has almost 1TW of solar capacity.
Build more nuclear, build more solar. Tax carbon.
scottcha 14 hours ago [-]
Shameless plug . . . I run a startup that is working to help with this: https://neuralwatt.com We are starting with an OS-level component (as in no model changes/no developer changes required) which uses RL to run AI with a ~25% energy-efficiency improvement without sacrificing UX. Feel free to DM me if you are interested in chatting, either about problems you face with energy and AI or if you'd like to learn more.
est31 1 days ago [-]
I wonder how the energy requirements are distributed between training and inference. Training should be extremely flexible, so one can only train when the sun shines and nobody uses the huge amount of solar power, or only when the wind turbines turn.
jnsaff2 1 days ago [-]
AFAICT the energy cost of training is still fairly low compared to the cost of the GPUs themselves, so especially during a land grab it's important to drive as close as possible to full utilization of the GPUs, energy be damned.
I doubt this is going to change.
That said, the flip side of energy cost being not a big factor is that you could probably eat the increase of energy cost by a factor of say 2 and this could possibly enable installation of short term (say 12h) battery storage to enable you to use only intermittent clean energy AND drive 100% utilization.
17 hours ago [-]
Henchman21 18 hours ago [-]
I work in DCO, thats Data Center Operations if you’re not aware. I’ve tried explaining the amount of power used to my elderly mom; it isn’t easy! But here’s my best take:
The racks I am personally responsible for consume 17.2kW. That’s consistent across the year; sure things dip a bit when applications are shut down, but in general 17.2kW is the number. Presuming a national average of 1.2kW per home, each rack of equipment I oversee could potentially power 14 houses. I am responsible for hundreds of these racks, while my larger organization has many thousands of these racks in many locations worldwide.
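For anyone who wants to redo the arithmetic, here are the comment's own numbers as a quick sketch:

```python
# Figures from the comment: 17.2 kW per rack, 1.2 kW national
# average per home.
rack_kw = 17.2
home_kw = 1.2

homes_per_rack = rack_kw / home_kw
print(f"{homes_per_rack:.1f} homes per rack")  # ~14.3

# Scale up: a site with a few hundred such racks draws as much
# as a small town. (500 is an illustrative count, not a real one.)
racks = 500
print(f"{racks} racks ≈ {racks * rack_kw / 1000:.1f} MW "
      f"≈ {racks * homes_per_rack:,.0f} homes")
```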
I’ve found no other way to let the scale of this sink in. When put this way she is very clear: the price isn’t worth it to humanity. Being able to get, say, Door Dash, is pretty neat! But not at the cost of all our hoarded treasure and certainly not at the cost of the environment on the only planet we have access to.
The work done by AI will only ever benefit the people at the top. Because to be frank: they won’t share. Because the very wealthy have hoarding disorder.
jahewson 11 hours ago [-]
But it can’t really power “14 houses” because people in those houses are consuming external services such as those provided by your racks.
Unless your racks can only serve 14 customers.
jeffbee 17 hours ago [-]
It seems like you are having an emotional response to not understanding the general energy picture. For example, an A320 aloft uses the same energy as two thousand of your hypothetical racks (2.5 tons of kerosene per hour).
Each!
We are in no meaningful sense torching the biosphere to get AI.
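A rough sanity check on that A320 figure, assuming kerosene at about 43 MJ/kg (my assumption; the 2.5 t/hour burn rate and the 17.2 kW rack are from the comments above):

```python
# Kerosene at ~43 MJ/kg is an assumed figure; 2.5 t/hour and
# 17.2 kW/rack come from the thread.
kerosene_mj_per_kg = 43.0
burn_kg_per_hour = 2500.0
rack_kw = 17.2

# kWh burned per hour equals average power draw in kW
aircraft_kw = burn_kg_per_hour * kerosene_mj_per_kg * 1e6 / 3.6e6
racks_equivalent = aircraft_kw / rack_kw

print(f"{aircraft_kw / 1000:.1f} MW thermal ≈ {racks_equivalent:,.0f} racks")
```

This lands around 1,700 racks of thermal power per airborne A320, so "two thousand" is the right order of magnitude (ignoring that one is thermal energy and the other electrical).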
globnomulous 11 hours ago [-]
> It seems like you are having an emotional response to not understanding the general energy picture.
This is condescending and rude. It also strikes me as obviously wrong. The 'response' isn't emotional in the slightest; it just explains the emotional and cognitive experience of acquiring understanding. It's a reasonable, well-reasoned explanation of the difficulty of intuitively grasping how much energy these data centers, and thus the companies that use them, consume -- and then of the shock many experience when it dawns on them.
> For example, an A320 aloft uses the same energy as two thousand of your hypothetical racks (2.5 tons of kerosene per hour).
> Each!
> We are in no meaningful sense torching the biosphere to get AI.
What exactly is the reasoning here? That airplanes use a lot of energy, just like data centers or compared to data centers, and therefore that AI isn't ecologically damaging -- or isn't "meaningfully" damaging, whatever that means? That's not just wrong. It's a nonsequitur.
There's a simpler, easier, better way to wrap our heads around the data, one that doesn't require false dichotomies (or something like that): humans are torching the biosphere both to get AI and to travel.
myaccountonhn 6 hours ago [-]
Well said. Thank you.
jeffbee 2 hours ago [-]
> What exactly is the reasoning here? That airplanes use a lot of energy, just like data centers ...
No. It is not "just like data centers". That is my point. The amount of energy used by transportation is several orders of magnitude more than the energy used by information technologies. The energy used to fly back and forth to Nevada to write this whiny article was more than was needed to train some of the latest models. The whole topic is totally nonsense.
Henchman21 35 minutes ago [-]
It seems like you’ve missed my point utterly. My explanation was targeted at my elderly mom, not at Hacker News readers. Her responses were genuinely hers, and I think they are reasonable given her understanding and my explanation. I apologize this didn’t come across and I’ll have to rethink how I phrase something like this in the future.
That said, your response was emotional as well: as if your words were cast down from on high, a gift from the gods! Your arrogance and rudeness should inspire some self-examination, but I am not hopeful on that front.
fallingknife 10 hours ago [-]
But if you substitute the pre-doordash system of calling up a restaurant and ordering delivery or take out, the energy savings aren't even 1%. One gallon of gas contains 34 kWh of energy, so if one delivery takes 0.5 gallons of gas, it uses enough energy to power one of your racks for an hour. How many doordash orders can be processed by one of your racks in an hour? It's got to be in the millions.
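A quick sketch of that comparison, using the figures in this comment plus the 17.2 kW rack from upthread:

```python
# Figures from the comment: gasoline at ~34 kWh/gal, half a
# gallon per delivery; 17.2 kW per rack from the thread above.
gasoline_kwh_per_gal = 34.0
delivery_kwh = 0.5 * gasoline_kwh_per_gal   # energy per delivery trip
rack_hour_kwh = 17.2                        # one rack for one hour

print(delivery_kwh, rack_hour_kwh)  # 17.0 vs 17.2: roughly equal
```

So one half-gallon delivery trip burns about as much energy as an hour of a whole rack, which is the comment's point: the compute per order is a rounding error next to the driving.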
cco 16 hours ago [-]
Today Google launched a model, Gemma 3n, that performs about as well as SOTA models from 1-2 years ago and runs locally on a cell phone.
Training SOTA models will, like steel mills or other large industrial projects, have a substantial environmental footprint. But my prediction is that over time the vast majority of use cases in users' hands will essentially run on-device and be basically zero impact, both in monetary cost and environmentally.
briandear 1 days ago [-]
What’s the net energy footprint of an employee working in an office whose job was made redundant by AI? Of course that human will likely have another job, but what’s the math of a person who was doing tedium solved by AI and now can do something more productive that AI can’t necessarily do. In other words, let’s calculate the “economic output per energy unit expended.”
On that note, what’s the energy footprint of the return to office initiatives that many companies have initiated?
lm28469 1 days ago [-]
> a person who was doing tedium solved by AI and now can do something more productive that AI can’t necessarily do.
Like driving a uber or delivering food on a bicycle ? Amazing!
folkrav 1 days ago [-]
> Of course that human will likely have another job, but what’s the math of a person who was doing tedium solved by AI and now can do something more productive that AI can’t necessarily do
That’s a lot of big assumptions - that the job getting replaced was tedious in the first place, that those other “more productive” job exists, that the thing AI can’t necessarily do will stay that way long enough for it not to be taken over by AI as well, that the tediousness was not part of the point (e.g. art)…
Scarblac 1 days ago [-]
When human civilization crashes due to yearly climate change caused famines it won't matter how useful the work done by the AIs was.
carunenjoyerlp 1 days ago [-]
Net energy change of people doing work at their desk versus browsing the internet versus playing games: you will likely not see a difference at all. They're all at rest, more or less thinking about something. People on their home sofa always have metabolic processes running regardless of whether they produce additional value for some corporation.
paulcole 12 hours ago [-]
Are these the same people who claimed that crypto was going to use more energy than the entire world by 2030?
kleiba 7 hours ago [-]
This reminds me of the quote: "Félix L'Herbier learns that there are more links in his brain than atoms in the universe."
Quenby 14 hours ago [-]
With today’s AI systems, we still have very little visibility into their actual energy costs. As we push for larger models and faster responses, it’s worth asking whether we’re unintentionally accelerating energy use and emissions.
Finding the balance between innovation and long-term responsibility feels more important than ever.
liendolucas 6 hours ago [-]
> ...and many buildings use millions of gallons of water (often fresh, potable water) per day in their cooling operations.
This is outrageous. People still struggle to access fresh water (and power), but hey "sustainability is all to our company" is always promoted as if something nice is being done on from the behemoth's sides. BS. What a waste of resources.
I truly condemn all this. To this day I do still refuse to use any of this technology and hope that all this ends in the near future. It's madness. I see this as nothing more than next-gen restrictive lousy search engines, and as many have pointed out ads are going to roll soon. The more people adopt it the worse will be for everyone.
I always emphasize this: 10-15 years ago I could find everything through simple web searches. Everything. In many cases even landing on niche and unexpected but useful and interesting websites. Today that is a difficult/impossible task.
Perhaps there is still room for a well-done traditional search engine (I haven't tried Kagi, but people generally say nice things about it) to surface and take the lead, but I doubt it; when hype arrives, especially in the tech industry, people follow blindly. There are still flourishing "ai" startups, and overnight everyone has become a voice or expert on the subject. Again: BS.
Traditional web engines and searches were absolutely fine, and I was quite impressed with their output. I remember it. What the heck has happened?
bogtog 5 hours ago [-]
> ...and many buildings use millions of gallons of water (often fresh, potable water) per day in their cooling operations.
Of note, cooling is water evaporation, so the water will inevitably come back to us as good as new. This contrasts with uses that actually pollute water
acidburnNSA 19 hours ago [-]
I'm glad that the needs of AI and the sustainable capabilities of nuclear fission go well together.
djoldman 16 hours ago [-]
> This leaves even those whose job it is to predict energy demands forced to assemble a puzzle with countless missing pieces, making it nearly impossible to plan for AI’s future impact on energy grids and emissions. Worse, the deals that utility companies make with the data centers will likely transfer the costs of the AI revolution to the rest of us, in the form of higher electricity bills.
... So don't? Explicitly shift the cost to the customer.
If I want to hook up to the energy grid with 3-phase power, I pay the utility to do it.
If a business wants more power and it isn't available, then the business can pay for it.
Then only businesses that really need it will be willing to step up to the plate.
No amount of "accounting" or "energy needs prediction" will guard against regulatory capture.
mark_l_watson 1 day ago [-]
The book “AI Atlas” covers the energy and other costs of AI.
It's from 2021 so won't cover the 2022-onwards generative AI boom.
From the Wikipedia summary it sounds like it's about machine learning algorithms like classification, AlphaGo and concerns about ethics of training and bias.
mark_l_watson 3 hours ago [-]
thank you! I got the name wrong. “Atlas of AI” by Kate Crawford.
emushack 1 day ago [-]
Link?
17 hours ago [-]
LordDragonfang 9 hours ago [-]
> The new model uses [energy] equivalent to riding 38 miles on an e-bike... AI companies have defended these numbers saying that generative video has a smaller footprint than the film shoots and travel that go into typical video production. That claim is hard to test and doesn’t account for the surge in video generation that might follow if AI videos become cheap to produce.
"Hard to test", but very obviously true if you make any attempt at guessing based on making a few assumptions... like they seem comfortable doing for all the closed source models they don't have access to being run in conditions they're not testing for. Especially considering they're presenting their numbers as definitive, and then just a couple paragraphs down admit that, yeah, they're just guessing.
Regardless, I know for a fact that a typical commercial shoot uses way more energy than riding across the TMZ on an e-bike (considering they're definitely using cars to transport gear, which gives you less than 4 miles for the same energy).
fred69 18 hours ago [-]
Might have missed it but was disappointed to see no mention of externalized costs like the scraping burden imposed on every IP-connected server. From discussions on HN this sounds quite substantial. And again, why exactly should the few AI companies reap all the value when other companies and individuals are incurring costs for it?
James_K 5 hours ago [-]
The fact that none of these companies want to tell you how much power they're using should be enough to reason that it's utterly horrible. If it were anywhere reasonable, they'd be boasting about how low the number is.
bbor 18 hours ago [-]
Interesting, thanks for sharing! I share some concerns others have about this piece, but I’m most shocked about their finding that image generation is cheaper than text. As someone who’s gone down this rabbit hole multiple times, this runs against every single paper I’ve ever cited on the topic. Anyone know why? Maybe this is a recent change? It also doesn’t help that multimodal transformers are now blurring the lines between image and text, of course… this article doesn’t even handle that though, treating all image models as diffusion models.
Lerc 1 day ago [-]
The point that stood out to me as concerning was
"The carbon intensity of electricity used by data centers was 48% higher than the US average."
I'd be fine with as many data centers as they want if they stimulated production of clean energy to run them.
But that quote links to another article by the same author. Which says
"Notably, the sources for all this power are particularly “dirty.” Since so many data centers are located in coal-producing regions, like Virginia, the “carbon intensity” of the energy they use is 48% higher than the national average. The paper, which was published on arXiv and has not yet been peer-reviewed, found that 95% of data centers in the US are built in places with sources of electricity that are dirtier than the national average. "
"The average carbon intensity of the US data centers in our study (weighted by the energy they consumed) was 548 grams of CO2e per kilowatt hour (kWh), approximately 48% higher than the US national average of 369 gCO2e / kWh (26)."
which shows 375g/KWh (after converting from lb/MWh)
But the table they compare against shows.
VA 576g/KWh
TX 509g/KWh
CA 374g/KWh
and the EPA table shows
VA 268g/KWh
TX 372g/KWh
CA 207g/KWh
Which seem more likely to be true. The paper has California at only marginally better than the national average for renewables (Which I guess they needed to support their argument given the number of data centers there)
I like arxiv, It's a great place to see new ideas, the fields I look at have things that I can test myself to see if the idea actually works. I would not recommend it as a source of truth. Peer review still has a place.
If they were gathering emissions data from states themselves, they should have calculated the average from that data, not pulled the average from another, potentially completely different measure. Then their conclusions would have been valid regardless of what weird scaling factor they brought into their state calculations. The numbers might have been wrong, but the proportion would have been accurate, and it is the proportion that is being highlighted.
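That weighted-average point is easy to make concrete. A sketch with made-up energy shares (the per-state energy figures here are illustrative, not from the paper) and the g/KWh intensities quoted above, plus the lb/MWh conversion the thread mentions:

```python
# Sketch of the energy-weighted average intensity the comment argues the
# paper should have computed from its own per-state data.
def lb_per_mwh_to_g_per_kwh(x):
    # EPA eGRID reports lb/MWh; 1 lb = 453.592 g, 1 MWh = 1000 kWh
    return x * 453.592 / 1000

# (state, energy consumed, carbon intensity in g/KWh from the table above)
# Energy shares are assumed for illustration.
datacenters = [("VA", 10.0, 576), ("TX", 4.0, 509), ("CA", 3.0, 374)]

total = sum(e for _, e, _ in datacenters)
weighted = sum(e * g for _, e, g in datacenters) / total
print(round(weighted, 1))  # pulled toward VA's dirtier grid
```

The weighted average lands well above a simple per-state mean whenever the dirtiest grids host the most load, which is exactly the proportion argument being made.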
GuinansEyebrows 18 hours ago [-]
there are still negative externalities to high renewable-energy usage (heat and water usage, which itself requires energy to purify once returned to the sewer, plus the environmental impact of building an enormous heat island in places where there was little industry previously).
tempfile 7 hours ago [-]
Jesus, who writes this stuff?
> AI is unavoidable
> We will speak to models in voice mode, chat with companions for 2 hours a day, and point our phone cameras at our surroundings in video mode
This is surely meant to be an objective assessment, not a fluff piece.
mitch_said 1 day ago [-]
[dead]
gitroom 15 hours ago [-]
[dead]
sfpityparty 19 hours ago [-]
[flagged]
This program posts news to thousands of machines throughout the entire civilized world. Your message will cost the net hundreds if not thousands of dollars to send everywhere. Please be sure you know what you are doing. Are you absolutely sure that you want to do this? [ny]
Maybe we need something similar in LLM clients. Could be phrased in terms of how many pounds of atmospheric carbon the request will produce.
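A minimal sketch of what such a warning could look like, in the spirit of the old USENET prompt. The per-token energy and grid intensity here are assumed round numbers for illustration, not measured values:

```python
# Hypothetical pre-send carbon warning for an LLM client.
JOULES_PER_TOKEN = 3.0       # assumed inference energy per token
GRID_G_CO2_PER_KWH = 400.0   # assumed grid carbon intensity
G_PER_LB = 453.592

def co2_lbs(prompt_tokens, completion_tokens):
    joules = (prompt_tokens + completion_tokens) * JOULES_PER_TOKEN
    kwh = joules / 3.6e6     # 1 kWh = 3.6 MJ
    return kwh * GRID_G_CO2_PER_KWH / G_PER_LB

est = co2_lbs(500, 1500)
print(f"This request will emit roughly {est:.5f} lb of CO2. "
      "Are you sure you want to do this? [ny]")
```

Under these assumptions a 2,000-token exchange comes out to thousandths of a pound, which is itself an argument for why clients never bothered showing it.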
So they used to send this message, but then it stopped I assume. Costs lowered a lot or the benefits outweighed all associated costs. Same can happen here.
How is this even quantifiable?
How about this. Before using AI to make fake images and help kids cheat on their homework, we take it offline and use it to solve its own problem of energy use.
You know why this does not happen? Because the goal is profit, and the profit comes not from solving real, important problems, but from making people think it is helping them solve made-up problems.
Llama 3.1: ~0.7 to 1.5 grams of CO2
Rolling coal event: ~10,000 to 100,000+ grams of CO2
Yes.
You underestimate the pervasiveness of AI, and in particular, ChatGPT. It is quite popular in the blue collar trades.
And yeah, a lot of them probably regard everything that ChatGPT tells them as fact.
"Rolling coal" is the practice of modifying a diesel engine—usually in pickup trucks—to increase the amount of fuel entering the engine, which causes the vehicle to emit large, thick plumes of black smoke from the exhaust. This is often done by tampering with or removing emissions control devices.
This is one of the more hysterical things I have heard.
How would we even know it was translating animal language correctly?
You can see traffic. It's easy to understand the dangers in a collision because when you drive into something unexpectedly your body takes a hit and you get frightened since you immediately realise that it might cost you a lot of money but you don't know for sure.
Being subtly manipulated by a disgustingly subservient fake conversationalist is another thing altogether.
If memory serves, Jet A is not taxed at all federally in the case of for-profit corporations (while non-commercial users DO pay a tax!) and many states also either do not tax it or tax it very little.
It's completely insane that we do not tax fuel usage for probably the most energy-intensive way to move people and/or goods, and often that movement of people is entirely frivolous.
Joe driving to work spends a larger fraction of his income on fuel and thus fuel tax than his rich counterpart.
This is true for all "let the consumer/polluter pay" taxes; they're all regressive. They say: it's fine to burn up the world as long as you're rich.
Personally I like the idea of setting the price for emitting 1 ton of CO2 equivalent emissions to the realistic cost of capturing 1 ton of CO2. At least, that seems like a reasonable end goal for a carbon tax, since that could fully account for the negative externality. This would of course be obscenely expensive, which would be a strong incentive to lower carbon emissions where possible, and for people to consume less of products that require large emissions to make or use.
The carbon tax would also have to apply to imported goods to be effective, and tracking how much tax should apply to imports would be even more difficult than doing so for domestic sources of pollution.
Please see my comment again. Under a revenue-neutral carbon tax everyone gets money back. But they don't realize it. Costs only go up for people who emit more carbon than average.
Exhibit A: Canada's recent repudiation of their carbon tax because nobody knew they were getting rebates. https://www.cbc.ca/news/politics/carbon-tax-rebate-rebrand-1...
> Talk to some outside of your activist bubble
That's quite condescending btw. Is it "activism" to try to avert a calamity that will increase the cost of living by a lot more 20 years from now? I think it's good fiscal sense. Long-term thinking and planning. Y'know adult shit.
> Humans are, on average, selfish beings
And easily swayed by stupid arguments. Exhibit B: Canada's recent repudiation of the carbon tax because fossil fuel industry propaganda convinced everyone that the tax was the cause of price increases. Now prices will stay the same (because the market will bear them) but no one will get any rebate money.
Eh. It's not Bill Gates and Alice Walton. Sometimes the obvious answer is the real one: It's the fossil fuel industry.
> It's completely insane that we do not tax fuel usage for probably the most energy-intensive way to move people and/or goods and often that movement of people is entirely frivolous.
That one's just the arbitrage problem. Planes move around. If there is an international flight to a country that doesn't tax jet fuel (or taxes it less) then the plane is going to fly into LAX with enough fuel still in the tank to get back to the other jurisdiction and fill up again. Which actually increases fuel consumption because fuel is heavy and they otherwise wouldn't want to do that.
This is the same reason the EU doesn't tax jet fuel.
Any reason that can't be treated as a fuel import and taxed accordingly? I understand current laws may not allow it but is that legislation impossible to write?
Even flying would only cost about 10% more for example. And most other activities have carbon free alternatives they can shift to rather than just eat the cost. Which is kind of the point.
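The 10% figure is easy to sanity-check with assumed round numbers (the fare, per-passenger emissions, and carbon price below are illustrative, not from the thread or the article):

```python
# Back-of-envelope check of the "flying would only cost about 10% more" claim.
ticket_price = 1000.0          # long-haul round-trip fare, USD (assumed)
co2_tons_per_passenger = 1.0   # rough long-haul round-trip emissions (assumed)
carbon_price_per_ton = 100.0   # assumed carbon price, USD per ton CO2e

surcharge = co2_tons_per_passenger * carbon_price_per_ton
print(f"Surcharge: ${surcharge:.0f}, i.e. {surcharge / ticket_price:.0%} of the fare")
```

At capture-cost pricing (several hundred dollars per ton, as proposed upthread) the surcharge would of course be correspondingly larger.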
They've been implemented all over the world, because they're effective. They cover 31% of emissions in developed nations.
To whatever degree you could say they are unpopular, they're unpopular in regions where the government doing stuff about climate change (or just "the government doing stuff") is unpopular, which makes it odd to single out putting a price on carbon specifically
See where they are used here: https://carbonpricingdashboard.worldbank.org/
Are solar panels convenient? All polysilicon today is made with fossil fuels, and the R&D to make it with renewable energy is still in-progress. Not to mention that we ship them across the ocean with fossil fuel.
Even driving your car around, you are at least somewhat aware of the gas you are burning.
Emissions should be fixed on the production side (decarbonization) not on the demand side (guilt/austerity).
Scaling up battery production makes EVs more appealing on the demand side. How do you disincentivize fossil fuel production?
The problem with fossil fuels isn’t that they pollute, but that most of the negative impact of that pollution is borne by others. This results in an artificially low price which distorts the market and results in economically inefficient overuse.
Capture that cost by making producers pay a tax of the corresponding amount, and market forces will use the “right” amount of fossil fuels in ways that are a net benefit.
Or, even worse God forbid, think about how much carbon is produced to create a single bottle or carton of water. Then consider how casually people down bottles of water.
Funny how this suddenly became a thing after electrification became a thing. Need to find a new way to wag the finger after all.
The oil industry is a conglomerate of degenerates spamming boomer logic all the way down to the workers. Their memes propagate throughout society and lead to the other boomer characteristic of rewriting personal and societal history.
The finger waggers now are being programmed to pretend they talked about tire particulates and the carheads are being programmed to pretend they never cared about 0-60. This is another "We have always been at war with Eastasia", just like they all opposed the Iraq war from day 1 and didn't cancel the Dixie Chicks, et cetera.
This may have been discussed in specialist literature somewhere but even when I did ecology courses in university circa 2001ish, I never heard about tire particulates, while I did hear a lot about greenhouse gasses.
It's a concern but not a civilization ending concern like climate change. I low key resent these attempts to move goalposts to satisfy the writer's urge for negativity.
Consider that a bus has six to ten tires that each weigh around ten times more than a typical car tire. This is presented as the alternative to cars, is it even any different? Not implausible that it could actually be worse, especially if the bus isn't at full occupancy at all times.
Meanwhile the weight difference between EVs and petroleum cars is emphasized in the complaints, even though it isn't very large, while the much larger weight difference between any cars and buses is ignored. Because the point isn't to complain about tires, it's to complain about EVs.
And if the point actually was to complain about tires then you still wouldn't be talking about EVs, you would be talking about tires and how to make them shed less or construct them out of lower toxicity materials etc.
My city buses in peak travel hours have anywhere from 20 to 75 people on them. Even if we assume that every one of those folks would have carpooled (which rarely happens), we're looking at a lot of cars, and thus tires, on the road.
To make those sorts of calculations easy, you can ignore all the pressure/usage/etc nonsense and just do basic math on tire dimensions (including min/max tread depth and width, not just radius, though I typically ignore siping and whatnot) and typical longevity. Volume lost per mile driven is basic high-school arithmetic, and the only real questions are regarding data quality and whether the self-imposed constraints (e.g., examining real-world wear rather than wear given optimal driving or something) are reasonable.
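That tread-volume arithmetic can be sketched in a few lines. All tire dimensions, tread depth, lifespan, and rubber density here are assumed typical values, not measured data:

```python
import math

# Tread rubber shed per mile, approximating the worn tread band as a thin
# cylindrical shell (ignoring siping, as the comment above does).
def rubber_g_per_mile(outer_diameter_m, width_m, tread_worn_m, life_miles,
                      density_g_cm3=1.2):
    volume_m3 = math.pi * outer_diameter_m * width_m * tread_worn_m
    volume_cm3 = volume_m3 * 1e6
    return volume_cm3 * density_g_cm3 / life_miles

# Assumed typical car tire: 0.65 m diameter, 215 mm wide,
# ~6 mm of usable tread, 50k-mile life.
per_tire = rubber_g_per_mile(0.65, 0.215, 0.006, 50_000)
print(round(4 * per_tire, 2), "g/mile for a 4-tire car")
```

Swapping in bus tire dimensions and lifespans is the only change needed to run the bus-vs-car comparison discussed below.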
Harder rubber seems like it could make a difference, but then you could also put tires with harder rubber on a car.
You can get a heavier vehicle to have the same pressure at the road by using more and bigger tires, but then the problem is that the tires are bigger and there are more of them.
> plus its normal driving patterns have less wear than typical Tesla use.
Isn't a city bus constantly starting and stopping, both as a result of city traffic and picking up and dropping off passengers?
> To make those sorts of calculations easy, you can ignore all the pressure/usage/etc nonsense and just do basic math on tire dimensions (including min/max tread depth and width, not just radius, though I typically ignore siping and whatnot) and typical longevity.
I tried plugging these in and it still comes out as a 6-wheel commercial bus has several times the tire wear as a 4-wheel light truck, rather than being the same.
And I expected the difference to be even more, but I guess that goes to show how much the weight argument is motivated reasoning if ~7x the weight is only ~3x the tire wear and then people are complaining about something which is only ~1.2x the weight.
Pardon me if I ask the obvious question, but did you divide your result by the average number of people moved? Because that's the actual utility of mass vs. individual transport. I would find it rather surprising if tire wear was the one measure where buses didn't win out.
No, bus tires do not typically last 500k miles. <100k is the norm, and really not more than a long-life car tire.
They do get retreaded more often than car tires do, but that just means they get new rubber added regularly.
Tire temperature also will play a big role in tire wear, and I wouldn't expect bus tires to get very hot only rolling half the time and at a lower speed than the typical car.
And of course you also gotta factor in passenger count. Buses generally have more than just 1 or 2 people, while the vast majority of cars will have 1 or 2 people most of the time. And even if a bus tires were to wear out twice as fast as a car's tire, that is still less wear per person than a car.
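A minimal version of that per-passenger arithmetic, with assumed occupancy figures and the ~3x per-tire wear ratio estimated in another comment above:

```python
# Per-passenger tire wear: does the bus still win? All figures assumed.
car_wear = 4 * 1.0   # 4 tires, unit wear each
bus_wear = 6 * 3.0   # 6 tires, each wearing ~3x as fast (assumed)

car_per_person = car_wear / 1.5   # average car occupancy (assumed)
bus_per_person = bus_wear / 30    # average peak-hour bus load (assumed)
print(car_per_person > bus_per_person)  # bus still wins per passenger
```

Even doubling the assumed bus wear doesn't flip the comparison at these occupancies, which is the comment's point.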
Is there a way to quantify this? My experience as well is that the tire particulate pollution has mostly been an anti-EV talking point.
The biggest problem with tailpipe emissions used to be horrendous smog. That was mostly solved in many places, and now the biggest problem is the impact on the global climate.
The biggest issue with childhood mortality used to be disease. Now we (correctly) focus more on accidental deaths.
EVs solved tailpipe emissions, but they’re not perfect. Their biggest problem is just something else.
There’s been decades of lies about climate change. And once the truth got out society was already massively dependent on it. For cars specifically it was a deliberate policy to make e.g. the US car-dependent. And once the truth got undeniable the cope was switched to people’s “carbon footprint” (British Petroleum). In fact there are rumors that the cope echoes to this day.
Zoom out enough and it becomes obviously unproductive to make “mass ignorance” the locus of attention.
Do you really think the average person could within 2 orders of magnitude when estimating their carbon footprint for a year?
Why? AI isn't a human being. We have no obligation to be "fair" to it.
Political lobbying.
I can picture an Elizabeth Holmesian cartoon clutching her diamond necklace.
"Oh, won't somebody think of the tech billionaires?!"
> If you don't freak out about running your shower or microwave for a couple seconds or driving a few hundred feet
The basic premise of the modern tech industry is scale. It's not one person running a microwave for a couple of seconds, it's a few billion people running a microwave for the equivalent of decades.
The only purpose is to scapegoat the possible environmental or economic fallout. Might as well put it on individuals. Like what’s always done.
I’ve already seen it on the national broadcast. There, some supposed experts were wagging their fingers about using AI for “fun”. Making silly images.
Meanwhile we’re gonna put AI to good use in arms races: more spam (automated applications, ads, ads, ads, abuse of services) and anti-spam. There’s gonna be so much economic activity. Disruptive.
Likewise, I doubt that USENET warning was ever true beyond the first few years of the network's lifetime. Certainly if everything was connected via dial-up, yes, a single message could incur hundreds of dollars of cost when you added up the few seconds of line time it took to send it across the whole world. But that's accounting for a lot of Ma Bell markup. Most connections between sites and ISPs on USENET were done through private lines that ran at far faster speeds than what you could shove down copper phone wiring back then.
The article uses open source models to infer cost, because those are the only models you can measure since the organizations that manage them don't share that info. Here's what the article says:
> The largest of our text-generation cohort, Llama 3.1 405B, [...] needed 3,353 joules, or an estimated 6,706 joules total, for each response. That’s enough to carry a person about 400 feet on an e-bike or run the microwave for eight seconds.
I just looked at the last chat conversation I had with an LLM. I got nine responses, about the equivalent of melting the cheese on my burrito if I'm in a rush (ignoring that I'd be turning the microwave on and off over the course of a few hours, making an awful burrito).
How many burritos is that if you multiply it by the number of people who have a similar chat with an LLM every day?
Now that I'm hungry, I just want to agree that LLMs and other client-facing models aren't the only ML workload and aren't even the most relevant ones. As you say adtech has been using classifiers, vector engines, etc. since (anecdotally) as early as 2007. Investing algorithms are another huge one.
Regarding your USENET point, yeah. I remember in 2000 some famous Linux guy freaking out that members of Linuxcare's sales team had a 5 line signature in their emails instead of the RFC-recommended 3 lines because it was wasting the internet or something. It's hard for me to imagine what things were like back then.
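The burrito comparison above is easy to check against the article's per-response figure; the microwave wattage here is an assumed typical value:

```python
# How much microwave time equals one Llama 3.1 405B response?
JOULES_PER_RESPONSE = 6706   # figure quoted from the article
MICROWAVE_WATTS = 800        # assumed typical microwave power

seconds_per_response = JOULES_PER_RESPONSE / MICROWAVE_WATTS
print(round(seconds_per_response, 1), "s of microwave time per response")
print(round(9 * seconds_per_response, 1), "s for a nine-response chat")
```

At a higher-powered unit the per-response time drops toward the article's "eight seconds"; either way a whole chat stays comfortably in burrito-melting territory.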
Are you saying all of that new capacity is needed to power non-LLM stuff like classifiers, adtech, etc? That seems unlikely.
Had you said that inference costs are tiny compared to the upfront cost of training the base model, I might have believed it. But even that isn't accurate -- there's a big upfront energy cost to train a model, but once it becomes popular like GPT-4, the inference energy cost over time is dramatically higher than the upfront training cost.
You mentioned batch computing as well, but how does that fit into the picture? I don't see how batching would reduce energy use. Does "doing lots of work at once" somehow reduce the total work / total energy expended?
Well, partly because they (all but X, IIRC) have commitments to shift to carbon-neutral energy.
But also, from the article:
> ChatGPT is now estimated to be the fifth-most visited website in the world
That's ChatGPT today. They're looking ahead to 100x-ing (or 1,000,000x-ing) the usage as AI replaces more and more existing work.
I can run Llama 3 on my laptop, and we can measure the energy usage of my laptop--it maxes out at around 0.1 toasters. o3 is presumably a bit more energy intensive, but the reason it's using a lot of power is the >100MM daily users, not that a single user uses a lot of energy for a simple chat.
This seems like a classic tragedy of the commons, no? An individual has a minor impact, but rational switching to LLM tools by the collective will likely have a massive impact.
Something to temper this: lots of these AI datacenter projects are being cancelled or put on hiatus because the demand isn't there.
But if someone wants to build a nuke reactor to power their datacenter, awesome. No downsides? We are concerned about energy consumption only because of its impact on the earth in terms of carbon footprint. If it's nuclear, the problem has already been solved.
Wait, any sources for that? Because everywhere I go, there seems to be this hype for more AI data centers. Some fresh air would be nice.
AI seems like it is speedrunning all the phases of the hype cycle.
"TD Cowen analysts Michael Elias, Cooper Belanger, and Gregory Williams wrote in the latest research note: “We continue to believe the lease cancellations and deferrals of capacity points to data center oversupply relative to its current demand forecast.”"
If you want to know more about energy consumption, see this 2 part series that goes into tons of nitty-gritty details: https://blog.giovanh.com/blog/2024/08/18/is-ai-eating-all-th...
You could do it better than we are doing now, but you'll always have people saying: "that's unfair, why are you picking on me"
Mind you people won't like that since we're so used to using the atmosphere as a free sewer. The idea of having to pay for our pollution isn't palatable since the gasses are mostly invisible.
Though it's sad that we're talking about market solutions rather than outright bans for the majority of applications like we did for leaded gas.
Meanwhile the people with a 10 year old car they drive 5000 miles a year will keep it until it's a 20 year old car, at which point they'll buy another 10 year old car, but by then that one will run on electricity.
Then you could theoretically ban it, but by then do you even need to?
You don't have to ban existing cars; they will phase themselves out. Give it X years and ban the sale of any non-hybrids for all but a few niche applications. Then in X+Y years ban all combustion engines other than niche applications.
But ultimately, we need to be serious about this, and half the population and the governments of most western countries are not serious. Many people still believe that climate change is a hoax, and ridiculous ideas like hydrogen cars and ammonia burning ships are still getting funding.
Maybe that says the fees aren't yet high enough for high income people to change behavior, but I'm willing to bet they never truly will be due to the influence this subset of the population holds over politics.
Even if rich people don’t consume much more energy than poor people (I have no idea, just engaging with your idea as stated), they must be buying something with their money… carbon taxes should raise the price of goods with lots of embodied carbon.
If they aren’t consuming much energy and they aren’t buying stuff with much embodied carbon… I dunno, I guess that’s the goal, right?
As for the stove, how much it uses is directly related to the kind of cooking you do, and for how long.
Even with things like orphaned natural gas that gets flared otherwise - rescuing the energy is great but we could use it for many things, not just LLMs or bitcoin mining!
If you would have built 10GW of solar or nuclear to replace other generation and instead the data center operators provide funding to build 20GW so that 10GW can go to data centers, the alternative wasn't replacing any of the other dirty generation. And the economies of scale may give the non-carbon alternatives a better cost advantage so you can build even more.
Rivers gotta run, suns gotta shine, winds gotta blow.
Increasing demand can lead to stimulus of green energy production.
Making electricity so abundant and efficient is probably more solvable. You can’t solve stu… society
(cgroups, as per a sibbling comment, are addressed in this write-up as "not maximally satisfying")
I expect rapid progress in both model efficiency and hardware specialization. Local inference on edge devices, using chips designed specifically for AI workloads, will drastically reduce energy consumption for the majority of tasks. This shift will free up large-scale compute resources to focus on truly complex scientific problems, which seems like a worthwhile goal to me.
The low-hanging fruit has been plucked by said silicon development process, and while remarkable improvement in AI efficiency is likely, it is highly unlikely to follow a similar curve.
More likely is slow, incremental process taking decades. We cannot just wish away billions of parameters and the need for trillions of operations. It’s not like we have some open path of possible improvement like with silicon. We walked that path already.
Maybe photonics..
I can imagine that doing some clever offloading to normal programs and using the LLM as a sort of "fuzzy glue" for the rest could improve the efficiency of many common tasks.
Impressive how Big Tech refuses to share data with society for collective decisions.
I'd also recommend the Data Vampires podcast series:
https://techwontsave.us/episode/241_data_vampires_going_hype...
https://techwontsave.us/episode/243_data_vampires_opposing_d...
https://techwontsave.us/episode/245_data_vampires_sacrificin...
https://techwontsave.us/episode/247_data_vampires_fighting_f...
I guess it becomes okay when the companies guzzling the energy are some of the biggest tech employers in the world, buttering your bread in some way.
Whether the cost/benefit works out in the case of AI is another question.
On one hand, the cost of compute per token has gone down a lot, and will continue to go down, because that's exactly the economic incentives at play. We had a little short-term nonsense where "the bigger the better" was all the rage, but inference was never this way, and now training is also pushing in this direction.
But on the other hand, less compute per token means it can be more broadly deployed. And so there is likely more energy use, not less, in the long run.
Companies like Apple and Google are both building data centers and trying to make on-device AI a thing. Unfortunately, they also keep inventing new, more expensive algorithms.
It’s at least plausible that most LLM use will become cheap enough to run on battery-limited devices like laptops and phones, though it’s not what most people are betting on.
[1] https://epoch.ai/data-insights/llm-inference-price-trends
VR is back in its niche.
3DTV .. I've not seen that marketed for quite a while now. Things that are fads will die out eventually.
Crypto meanwhile is in an odd space: everyone knows blockchains are too much of a hassle and that the volatility is too high, so they're using centralized exchanges and "stablecoins" (shadow dollars). There's still a huge amount of money there but not quite as much as the stadium ads / FTX peak.
- "by 2028 [...] AI alone could consume as much electricity annually as 22% of all US households."
What would the 22% be if compared against all US energy instead of just all US households?
[1] https://rpsc.energy.gov/energy-data-facts
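A rough conversion to answer that question, using ballpark public figures (≈131M US households, ≈10,500 kWh/yr average household electricity use, ≈4,000 TWh total annual US electricity generation — all my assumptions, none from the article):

```python
# Back-of-envelope: what share of ALL US electricity would
# "22% of US households" be? Inputs are rough public figures.
households = 131e6          # approximate number of US households
kwh_per_household = 10_500  # average annual household electricity, kWh
total_us_twh = 4_000        # approximate total annual US electricity, TWh

ai_twh = 0.22 * households * kwh_per_household / 1e9  # convert kWh -> TWh
share_of_total = ai_twh / total_us_twh

print(f"{ai_twh:.0f} TWh, ~{share_of_total:.1%} of total US electricity")
```

So "22% of households" works out to something like 7-8% of total US electricity, under these assumptions.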
> In 2017, AI began to change everything. Data centers started getting built with energy-intensive hardware designed for AI, which led them to double their electricity consumption by 2023.
As we all know, the generative AI boom only really kicked into high gear in November 2022 with ChatGPT. That's five years of "AI" growth between 2017 and 2022 which presumably was mostly not generative AI.
Nearly any other daily activity of a consumer in the developed world uses orders of magnitude more energy and resources than scrolling TikTok on a phone.
Examples?
– Driving to work: commuting burns far more fuel in a week than your phone uses in a year.
– Gym sessions: heated, lit, air-conditioned spaces plus transit add up quickly.
– Gaming or watching TV: bigger screens and bigger compute mean easily 100x higher power needs vs phone gaming.
– Casually cooking at home: using a metric ton of appliances (oven, stove, fridge, pans) powered like twice a week, replaced every ~10 years.
– Reading print media: a daily newspaper or weekly book involves pulp, ink, shipping, and disposal.
– Streaming on a laptop or smart TV: even this draws more power than your phone.
– Taking a shower: the hot water energy use alone dwarfs your daily phone charge.
Of course not doing any sports or culture is also not what societies want, but energy-wise a sedentary, passive TikTok lifestyle is as eco-friendly as it gets vs. any other real-world example.
Phones are basically the least resource-intensive tool we use regularly. Externalities, context, and limited human time effects matter a lot more than what one phone uses vs the other.
Even e-readers already break even with books after 36 paper equivalents
https://www.npr.org/2024/05/25/1252930557/book-e-reader-kind...
If you don't want to go there, it doesn't really matter how much energy the human uses because the human will just use the same energy to do something else.
Human's got to exist and needs to work to eat. They don't really, necessarily, existentially need to be 10x productive with the help of AI.
But I'll be honest, that's not really a solid argument, because it could rapidly lead to the question of why they do this exact job in the first place, instead of e.g. farming or whatever else there might be that can be called a net positive for humanity without reservations.
Environmental, social, and governance (ESG) is shorthand for an investing principle that prioritizes environmental issues, social issues, and corporate governance.
I found this article to be a little too one sided. For instance, it didn’t talk about the 10x reductions in power achieved this past year — essentially how gpt4 can now run on a laptop.
Viz, via sama “The cost to use a given level of AI falls about 10x every 12 months, and lower prices lead to much more use. You can see this in the token cost from GPT-4 in early 2023 to GPT-4o in mid-2024, where the price per token dropped about 150x in that time period. Moore’s law changed the world at 2x every 18 months; this is unbelievably stronger.” https://blog.samaltman.com/three-observations
He’s one of the most knowledgeable persons in the world on the topic. It’s like saying that the ceo of BMW isn’t citable on conversations about expected cost decreases in electric cars.
Any environmental topic is susceptible to a huge amount of groupthink. The world isn’t so binary as people make it out to be. It is far from truth that LLMs=bad for environment, any more than computers=bad for the environment.
Altman mentions a 150x increase in efficiency, and you claim that trend will continue through to GPT-6. At that point these models would be 22,500x as efficient as they currently are, which would mean generating a 10-hour-long video would cost around the same amount of electricity as running your microwave for 15 minutes. Will you have some introspection if that doesn't come to pass?
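The arithmetic behind that 22,500x figure is just the claimed 150x per-generation cost drop compounding twice (the 150x and the two-more-generations framing are both from the comments above, not measured data):

```python
# If cost per token drops 150x per model generation, two more
# generations of the same trend compound multiplicatively.
drop_per_generation = 150
generations = 2
total_drop = drop_per_generation ** generations
print(total_drop)  # 22500
```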
What I mean is that I have a healthy level of skepticism with Altman. He has to constantly battle for funding. Surely, he must be knowledgeable about LLMs, but he's the CEO of the largest AI company in the world, his PR needs to give "most knowledgeable person in the world on the topic" but I think that title should go to all of the engineers and developers working on these technologies and not a capital founder.
All that said, I agree that LLMs being bad for the environment is a complex topic. I think it would be more accepted if people had safety nets and could be excited for AI to take their job instead of having to be terrified, or if AI weren't just used as another tool for increasing wealth inequality.
1/ did not exist before
2/ does not replace/reduce previous/other power (some, very much more critical and essential) usages.
3/ a LOT of tasks are still way more energy/time-efficiently done with regular existing methods (dedicated software, or even by hand), but are still asked/improperly routed to AI chatbots that ... statistically guess the answer.
It also leads to automation and efficiency, even if it isn’t a fully linear path.
AI isn't a waste. We can’t let environmental consciousness get in the way of rather natural human development. Especially CO2. (I have different opinions about biodiversity because of the irreversible loss. I also believe that we have the technology to pause and reverse climate change — but don’t pursue it because of degrowth ideologies)
Give some concrete examples and stats?
> It also leads to automation and efficiency, even if it isn’t a fully linear path.
Ditto.
For _who_ really?
Shareholders of AI tools producing companies?
Shareholders of companies that pretend to replace people and verifiable working processes with poorly understood black boxes?
(I can't help but notice the _same_ playbook as with crypto, NFTs, Web3, metaverse, and the same enabling-hardware provider).
Value, automation, efficiency will not solve the climate change challenges, if they are not directed towards it aggressively, as well as humanity acceptance and well-being.
Alas, they are directed towards a very little few bank accounts. Violence and subjugation, in many of their forms, is directed towards the others. It's not by accident.
While “economic value” ≠ “solving climate change”, without enough tax revenue costly transitions are impossible.
It's revealing that for some, it's easier to imagine Earth without life than Earth without capitalism.
I wouldn't be surprised if mankind will evolve similar to an organism and use 20% of all energy it produces on AI. Which is about 10x of what we use for software at the moment.
But then more AI also means more physical activity. When robots drive cars, we will have more cars driving around. When robots build houses, we will have more houses being built, etc. So energy usage will probably go up exponentially.
At the moment, the sun sends more energy to earth in an hour than humans use in a year. So the sun alone will be able to power this for the foreseeable future.
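That "an hour of sunlight ≈ a year of human energy" claim roughly checks out from standard physical constants (the solar constant and Earth's radius are textbook values; ~600 EJ/yr global primary energy is my assumption, not from the thread):

```python
import math

# Sanity check: solar power intercepted by Earth in one hour vs.
# annual human primary energy use.
solar_constant = 1361                 # W/m^2 at top of atmosphere
earth_radius = 6.371e6                # m
cross_section = math.pi * earth_radius ** 2         # m^2 facing the sun
intercepted_power = solar_constant * cross_section  # W, ~1.7e17

one_hour_joules = intercepted_power * 3600
human_annual_joules = 6e20            # ~600 EJ/yr global primary energy

print(one_hour_joules / human_annual_joules)  # ~1.0: one hour ≈ one year
```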
The main complaint about energy usage is it will damage the environment, which will (indirectly) reduce quality of life.
It's a question of which factor wins.
I don't think there will be much carbon intensive energy creation in a few decades from now. It does not make sense economically.
Anyway I hope you're right, but so far global CO2 output is still growing. All the other energy has only come on top of carbon intensive energy, it hasn't replaced any of it. Every time we build more, we find new ways of spending that much energy and more.
I remember how me and my friends discovered email in 1999 and were like "Yay, in the future we'll all do this instead of sending letters!". And it took about 20 years until letters were largely replaced by email and the web. And when the first videos appeared on the web, it was quite clear to us that they would replace DVDs.
Similar with the advent of self driving cars and solar energy I think.
> The carbon intensity of electricity used by data centers was 48% higher than the US average.
If the people always talking about how cheap solar is want to fix this, find a way to make that cheapness actually make it into the customer's electric bill.
There are periodically news articles and such about data centers in Iceland, of course, but I get the impression it's mostly a fad, and the real build-outs are still in Northern Virginia as they've always been.
The typical answer I've seen is that Internet access and low latency matter more than cooling and power, but LLMs seem like they wouldn't care about that. I mean, you're literally interacting with them over text, and there's already plenty of latency - a few extra ms shouldn't matter?
I'd assume construction costs and costs of shipping in equipment also play a role, but Iceland and Canada aren't that far away.
Kind of sounds like Jevons paradox? https://wiki.froth.zone/wiki/Jevons_paradox
This doesn't seem true. In SF, Waymo with 300 cars does more rides than Lyft with 45k drivers. If self-driving cars interleave different tasks based on their routes, I imagine they would be much more efficient per mile.
Seems like we are way too early in the adoption curve to tell. Currently the average number of passengers per trip is >1.0 across the whole fleet. Some day, I'd expect that to dip below 1.0, as people send an empty car to pick up the dog from the vet, or circle the block to avoid having to pay for parking, etc.
> With more than 700 vehicles in its fleet - 300 of which operate in San Francisco - Waymo is the only U.S. firm that runs uncrewed robotaxis that collect fares.
Those numbers are from April 2025.
https://waymo.com/blog/2025/05/scaling-our-fleet-through-us-...
A car driving from A to B will cost less than 50% of the current price. Which will unlock a huge amount of new rides.
Not sure about the exact numbers, but I guess that at the moment normal roofs and solar panels absorb very roughly about the same percentage of sunlight.
So if in the future solar panels become more efficient, then yes, the amount of sunlight turned into heat could double.
Maybe that can be offset by covering other parts of earth with reflective materials or finding a way to send the heat back into the universe more effectively.
And also, people should paint their roofs white.
Some could, but a dark roof can be beneficial in the winter
GAFAM nuclear plans are mere announcements, intentions.
On the other front, most are already making progress. https://www.reuters.com/sustainability/climate-energy/micros...
The discussion about nuclear vs solar remind me of the discussions about spinning HDs versus solid state drives when they were new.
Yes, much of what is being promoted is slop. Yes, this bubble is driven by an overly financialized economy. That doesn't preclude the possibility of AI models precipitating meaningful advancements in the human condition.
From refrigeration to transportation, cheap and abundant energy has been one of the major driving forces in human advancement. Paradoxically, consuming cheap energy doesn't reduce the amount of energy available on the market. Instead it increases the size of the market.
A few big consumers in centralized locations isn't changing the grid as much as the energy transition from fuels to electricity is
Table A1 , PDF page 29:
https://www.epri.com/research/products/000000003002028905
(P.S. check your spelling!)
> But as more of us turn to AI tools, these impacts start to add up. And increasingly, you don’t need to go looking to use AI: It’s being integrated into every corner of our digital lives.
Forward looking, I imagine this will be the biggest factor in increasing energy demands for AI: companies shoving it into products that nobody wants or needs.
I think the bigger underrated concern is if LLMs fall into an unfortunate bucket where they are in fact generally useful, but not in ways that help us decarbonize our energy supply (or that do, but not enough to offset their own energy usage).
So yes, I'd like to disable this completely. Even if it's just a single birthday candle worth of energy usage.
I think of this a little every time Google gives me another result with the AI summary and no option for anyone to turn it off. Apparently worldwide there are 8+ billion searches every day.
I’m sorry. I’m being blocked by some mysterious force from understanding what “actual human” means. And I don’t know how to get you in contact with your car manufacturer. Would you like me to repeat my 20 step suggestion on how to troubleshoot “why does my shitty car put the A/C on freezer mode whenever “Indian Summer” tops the charts in Bulgaria”, but with more festive emojis?
But everyone knows fuel economy is everything but a knowable value. Everything from if it has rained in the past four hours to temperature to loading of the vehicle to the chemical composition of the fuel (HVO vs traditional), how worn are your tires? Are they installed the right way? Are your brakes lagging? The possibilities are endless. You could end up with twice the consumption.
By the way, copy-pasting from the website is terrible on desktop firefox, the site just lags every second, for a second.
Not sure about the "comprehensive" claim here if end-to-end query chains were not considered.
For example, the contribution of mobile wireless nodes (which the majority of users are on) to energy consumption is totally ignored. The wireless power amplifiers (PAs) on both the user and base-station sides are notorious for their inefficiency: less than 50% in practice, although in theory they can reach around 80%. Almost all current AI applications are cloud-based, not local-first, so the end users' energy consumption and contribution need to be counted too.
> There is a significant caveat to this math. These numbers cannot serve as a proxy for how much energy is required to power something like ChatGPT 4o.
Otherwise this is an excellent article critiquing the very real problem that is opacity of these companies regarding model sizes and deployments. Not having an honest accounting of computing deployed worldwide is a problem, and while it's true that we didn't really do this in the past (early versions of Google searches were undoubtedly inefficient!), it's not an excuse today.
I also wish this article talked about the compute trends. That is, compute per token is going significantly down, but that also means use of that compute can spread more. Where does that lead us?
Which I already thought was odd, because London would need all that electricity to see through the giant mountain of poop piled up by all the horses the British use for transportation.
Even where it matches sorta (the 400-feet e-bike thing), that only works out for me because I use an AMD card. An NVIDIA card can have several times the generation speed at the same power draw, so it all falls down again.
And the parameters they tried to standardize their figures with (the 1024x1024 thing) are also a bit meh, because the SAME number of pixels in a different aspect ratio can have huge variations in gen speed and thus power usage. For instance, for most Illustrious-type checkpoints the speed is about 60% higher at aspect ratios other than 1024x1024. It's all a bit of a mess.
https://par.nsf.gov/servlets/purl/10101126
https://dgtlinfra.com/data-center-water-usage/
https://datacenters.microsoft.com/sustainability/efficiency/
Oil might be able to carry more heat but it's more expensive to use.
Oil immersion is something nerds like to think is amazing but it's just a pain in the ass for negligible benefits. Imagine the annoyance of doing maintenance.
Build more nuclear, build more solar. Tax carbon.
I doubt this is going to change.
That said, the flip side of energy cost being not a big factor is that you could probably eat the increase of energy cost by a factor of say 2 and this could possibly enable installation of short term (say 12h) battery storage to enable you to use only intermittent clean energy AND drive 100% utilization.
The racks I am personally responsible for consume 17.2kW. That’s consistent across the year; sure things dip a bit when applications are shut down, but in general 17.2kW is the number. Presuming a national average of 1.2kW per home, each rack of equipment I oversee could potentially power 14 houses. I am responsible for hundreds of these racks, while my larger organization has many thousands of these racks in many locations worldwide.
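Spelling that arithmetic out (17.2 kW per rack and 1.2 kW per home are the figures above; 300 racks is an assumed round number standing in for "hundreds"):

```python
# The rack-vs-homes arithmetic from the comment, made explicit.
rack_kw = 17.2   # measured consumption per rack
home_kw = 1.2    # presumed national average per home

homes_per_rack = rack_kw / home_kw
print(f"{homes_per_rack:.1f} homes per rack")  # ~14.3

# Scaling to "hundreds of racks" (300 is an assumed round number):
racks = 300
print(f"{racks * homes_per_rack:.0f} homes")  # ~4300
```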
I’ve found no other way to let the scale of this sink in. When put this way she is very clear: the price isn’t worth it to humanity. Being able to get, say, Door Dash, is pretty neat! But not at the cost of all our hoarded treasure and certainly not at the cost of the environment on the only planet we have access to.
The work done by AI will only ever benefit the people at the top. Because to be frank: they won’t share. Because the very wealthy have hoarding disorder.
Unless your racks can only serve 14 customers.
Each!
We are in no meaningful sense torching the biosphere to get AI.
This is condescending and rude. It also strikes me as obviously wrong. The 'response' isn't emotional in the slightest; it just explains the emotional and cognitive experience of acquiring understanding. It's a reasonable, well-reasoned explanation of the difficulty of intuitively grasping how much energy these data centers, and thus the companies that use them, consume -- and then of the shock many experience when it dawns on them.
> For example, an A320 aloft uses the same energy as two thousand of your hypothetical racks (2.5 tons of kerosene per hour).
> Each!
> We are in no meaningful sense torching the biosphere to get AI.
What exactly is the reasoning here? That airplanes use a lot of energy, just like data centers or compared to data centers, and therefore that AI isn't ecologically damaging -- or isn't "meaningfully" damaging, whatever that means? That's not just wrong. It's a nonsequitur.
There's a simpler, easier, better way to wrap our heads around the data, one that doesn't require false dichotomies (or something like that): humans are torching the biosphere both to get AI and to travel.
No. It is not "just like data centers". That is my point. The amount of energy used by transportation is several orders of magnitude more than the energy used by information technologies. The energy used to fly back and forth to Nevada to write this whiny article was more than was needed to train some of the latest models. The whole topic is totally nonsense.
That said, your response was emotional as well: as if your words were cast down from on high, a gift from the gods! Your arrogance and rudeness should inspire some self-examination, but I am not hopeful on that front.
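For what it's worth, the two-thousand-racks figure roughly checks out, assuming jet fuel at ~43 MJ/kg (a standard specific-energy value; the 2.5 t/hour and 17.2 kW numbers are from the comments above):

```python
# Checking the A320 comparison: fuel burn converted to watts,
# then divided by the 17.2 kW per-rack figure from the thread.
kerosene_kg_per_hour = 2500
mj_per_kg = 43  # typical specific energy of jet kerosene

aircraft_watts = kerosene_kg_per_hour * mj_per_kg * 1e6 / 3600  # ~3e7 W
rack_watts = 17_200

print(aircraft_watts / rack_watts)  # ~1700 racks per airborne A320
```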
Training SOTA models will, like steel mills or other large industrial projects, require a lot of environmental footprint to produce. But my prediction is that over time the vast majority of use cases in the hands of users will be essentially run on device and be basically zero impact, both in monetary cost and environment.
On that note, what’s the energy footprint of the return to office initiatives that many companies have initiated?
Like driving a uber or delivering food on a bicycle ? Amazing!
That’s a lot of big assumptions - that the job getting replaced was tedious in the first place, that those other “more productive” job exists, that the thing AI can’t necessarily do will stay that way long enough for it not to be taken over by AI as well, that the tediousness was not part of the point (e.g. art)…
Finding the balance between innovation and long-term responsibility feels more important than ever.
This is outrageous. People still struggle to access fresh water (and power), but hey "sustainability is all to our company" is always promoted as if something nice is being done on from the behemoth's sides. BS. What a waste of resources.
I truly condemn all this. To this day I do still refuse to use any of this technology and hope that all this ends in the near future. It's madness. I see this as nothing more than next-gen restrictive lousy search engines, and as many have pointed out ads are going to roll soon. The more people adopt it the worse will be for everyone.
I always emphasize this: 10-15 years ago I could find everything through simple web searches. Everything. In many cases even landing on niche and unexpected but useful and interesting websites. Today that is a difficult/impossible task.
Perhaps there is still room for a well-done traditional search engine (haven't tried Kagi but people in general do say nice things about it) to surface and take the lead but I doubt it, when hype arrives especially in the tech industry people follow blindly. There are still flourishing "ai" startups and from night to day everyone has become a voice or expert on the subject. Again: BS.
Traditional web engines and searches were absolutely just fine, and I was quite impressed with their outputs. I remember it. What the heck has happened?
Of note, cooling is water evaporation, so the water will inevitably come back to us as good as new. This contrasts with uses that actually pollute the water.
... So don't? Explicitly shift the cost to the customer.
If I want to hook up to the energy grid with 3-phase power, I pay the utility to do it.
If a business wants more power and it isn't available, then the business can pay for it.
Then only businesses that really need it will be willing to step up to the plate.
No amount of "accounting" or "energy needs prediction" will guard against regulatory capture.
It's from 2021 so won't cover the 2022-onwards generative AI boom.
From the Wikipedia summary it sounds like it's about machine learning algorithms like classification, AlphaGo and concerns about ethics of training and bias.
"Hard to test", but very obviously true if you make any attempt at guessing based on making a few assumptions... like they seem comfortable doing for all the closed source models they don't have access to being run in conditions they're not testing for. Especially considering they're presenting their numbers as definitive, and then just a couple paragraphs down admit that, yeah, they're just guessing.
Regardless, I know for a fact that a typical commercial shoot uses way more energy than driving across the TMZ in an e-bike (considering they're definitely using cars to transport gear, which gives you less than 4 miles for the same energy).
"The carbon intensity of electricity used by data centers was 48% higher than the US average."
I'd be fine with as many data centers as they want if they stimulated production of clean energy to run them.
But that quote links to another article by the same author. Which says
"Notably, the sources for all this power are particularly “dirty.” Since so many data centers are located in coal-producing regions, like Virginia, the “carbon intensity” of the energy they use is 48% higher than the national average. The paper, which was published on arXiv and has not yet been peer-reviewed, found that 95% of data centers in the US are built in places with sources of electricity that are dirtier than the national average. "
Which in turn links to https://arxiv.org/abs/2411.09786
Which puts the bulk of that 48% higher claim on
"The average carbon intensity of the US data centers in our study (weighted by the energy they consumed) was 548 grams of CO2e per kilowatt hour (kWh), approximately 48% higher than the US national average of 369 gCO2e / kWh (26)."
Which points to https://ourworldindata.org/grapher/carbon-intensity-electric...
For the average of 369g/KWh. That's close enough to the figure in the table at https://www.epa.gov/system/files/documents/2024-01/egrid2022...
which shows 375g/KWh (after converting from lb/MWh)
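The headline ratio itself does follow from the paper's own two numbers:

```python
# Verifying the "approximately 48% higher" carbon-intensity claim
# from the figures quoted in the paper (548 vs 369 gCO2e/kWh).
data_center_intensity = 548  # gCO2e/kWh, weighted average in the paper
us_average = 369             # gCO2e/kWh, quoted national average

excess = data_center_intensity / us_average - 1
print(f"{excess:.0%} higher")  # ~48.5%, which the paper rounds to 48%
```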
But the table they compare against and the EPA table show different state-level numbers, and the EPA figures seem more likely to be true. The paper has California at only marginally better than the national average for renewables (which I guess they needed to support their argument, given the number of data centers there).
I like arXiv: it's a great place to see new ideas, and the fields I look at have things I can test myself to see if an idea actually works. But I would not recommend it as a source of truth. Peer review still has a place.
If they were gathering emissions data from the states themselves, they should have calculated the average from that data, not pulled the average from another, potentially completely different measure. Then their conclusions would have been valid regardless of whatever weird scaling factor they brought into their state calculations. The numbers might have been wrong, but the proportion would have been accurate, and it is the proportion that is being highlighted.
> AI is unavoidable
> We will speak to models in voice mode, chat with companions for 2 hours a day, and point our phone cameras at our surroundings in video mode
This is surely meant to be an objective assessment, not a fluff piece.