Surely SamA doesn’t actually think that they’ll more than 20x their compute in a few years? I’m sure the researchers there would love to do more research, with more compute, faster, but 20+x growth is not a practical expectation.
Is the goal here to create a mad rush to build data centers, which should decrease their costs with more supply? Do they just want governments to step in and help somehow? Is it part of marketing/hype? Is this trying to project confidence to investors on future revenue expectations?
p1esk 1 days ago [-]
> Surely SamA doesn’t actually think that they’ll more than 20x their compute in a few years?
If their goal is to train, say, a 100T model on the whole YouTube dataset, they will need 20,000x more compute. And that would be my goal if I were him.
bix6 1 days ago [-]
Why 20000x more compute? I thought they were at approx 1T with current compute?
Edit: looked it up. 10k+ times more for training compute. Sheesh. Get the Dyson sphere ready lol.
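A back-of-envelope check on that multiplier, using the common approximation that training compute ≈ 6 × parameters × training tokens. The 100T-parameter model and YouTube-scale corpus are the hypotheticals from the comments above; the baseline and the token counts below are illustrative assumptions, not anyone's disclosed figures.

    # Rough sanity check of the "10k+ times more training compute" estimate.
    # Approximation: training FLOPs ~= 6 * parameters * training tokens.
    # All concrete numbers here are illustrative assumptions.

    def training_flops(params: float, tokens: float) -> float:
        return 6 * params * tokens

    # Assumed baseline: a ~1T-parameter model trained on ~15T text tokens.
    baseline = training_flops(1e12, 15e12)       # ~9e25 FLOPs

    # Hypothetical target from the thread: a 100T-parameter model trained on
    # a YouTube-scale corpus, assumed here to tokenize to ~2,000T tokens.
    target = training_flops(100e12, 2_000e12)    # ~1.2e30 FLOPs

    print(f"ratio: ~{target / baseline:,.0f}x")  # roughly 13,000x

Nudging the assumed corpus and baseline sizes moves the answer anywhere from roughly 10,000x to 20,000x, which is why both figures above are in the same ballpark.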
"At this point I think I know more about manufacturing than anyone currently alive on Earth."
It's that dumbass at your work who thinks that, solely because he landed a job in his early 20s that pays more than his parents ever made combined, he can school everyone on every topic imaginable, from nutrition to religion.
He and Elon make way more than that dumbass, so the egos get inflated even more.
I don't especially like Tucker Carlson, but I think the more screen time we give these people with an open mic, the better: everyone gets first-hand experience of how detached from reality they are.
blitzar 23 hours ago [-]
It is better to remain silent and be thought a fool than to open your mouth and remove all doubt
IT4MD 15 hours ago [-]
Thanks for the example.
johnnienaked 23 hours ago [-]
Absolutely right, and it's ubiquitous across organizations too.
I've never met an executive I respect. They're all absolute experts at appearing competent.
aswegs8 16 hours ago [-]
I mean, they're selected for it so that's not surprising
johnnienaked 13 hours ago [-]
I guess the surprising part is that appearing competent is more important to shareholders than being competent
estimator7292 10 hours ago [-]
Actually checking if someone is competent requires actual work, though. Work is for lesser people. Shareholders just know if a person is or is not competent, that's why they have so much money, right?
b00ty4breakfast 14 hours ago [-]
I can never tell if these guys have come to genuinely love the smell of their own farts or if they're just constantly in sales mode. Like maybe all those hours in meetings with investors and shareholders or whatever have gotten them stuck that way, like your mom used to warn you about when you'd make faces at your little brother.
dmbche 11 hours ago [-]
If they know it won't bring in revenue, they can't get out of "sales mode", because when the runway ends they're the ones left out. It's like musical chairs with one chair left: you want to keep the game going if you don't think you'll get the chair. And you're filthy rich as long as the game keeps going.
llbbdd 11 hours ago [-]
When your job is to constantly be making the pitch for your company, and you live in a world where every conversation you have can be news before the end of the day, the mask can never come off.
kadushka 1 days ago [-]
Mainly because the global video corpus is >100,000x larger than the global text corpus, so you will need to train much larger models for much longer (than current LLMs).
Drblessing 1 days ago [-]
That would be awesome.
zingerlio 1 days ago [-]
The AI bubble bursts when he stumbles trying to raise that money.
ants_everywhere 1 days ago [-]
If they want to survive they need to outrun Google and have a differentiated service. As of now it's not clear that OpenAI will have a reason to exist in the near future alongside Anthropic and Google.
They're likely betting on either training a model so big they can't be ignored, or possibly focusing more on B2B, which means lots of compute to resell.
CuriouslyC 13 hours ago [-]
If their plan was to go toe to toe with Google as a foundation model/inference provider they would 100% be getting ground to dust; that's not a winnable fight. There's a reason they've pivoted to product and retained Jony Ive.
Anthropic gets a TON of hate on social media, their models are fragile, their infra is poorly managed, they're 100% going to end up in Jeff's pocket. OpenAI is a survivor.
Szpadel 12 hours ago [-]
I believe it is because of RL.
You are no longer limited by training data, since you generate it on the fly during learning, so benchmark-driven training could scale with compute.
They also seem to assume that everyone will use AI from them in the future, especially with the new "Pulse" combined with ads. Scaling that will need much more compute.
Is this reasonable? I'm not convinced, but I believe this is their reasoning.
tim333 14 hours ago [-]
My guess:
Altman figures AI will be a big deal and constrained by available compute.
If he locks in all the available compute and related financing before the competition, then he's locked in as the #1 AI company.
I'm not sure 20x or 5x or 40x matters, nor revenue expectations, so much as being ahead of the competition.
pram 1 days ago [-]
Perceptually, it helps to take scrutiny off the current spend? It isn't a bubble if you can just scoff at $100 billion and say, "that's pocket change, this will actually require 10 quadrillion dollars!!"
Tubelord 8 hours ago [-]
Pascal’s wager applied to tech cycles. The fervent adherence to the hype is akin to religious zealotry in many ways.
matt-p 7 hours ago [-]
I think they could probably 'use' 10x. There are rumours that one of the reasons they're not shipping the new Jony Ive device is that they haven't got the compute for it. If you need 10x, it's probably better to ask for 20x and have a glut than ask for 10x and have a shortage.
arthurofbabylon 14 hours ago [-]
The phrase you are looking for is “commodifying the periphery.” As adjacent bottlenecks open up, the bottleneck you control becomes more valuable.
indolering 1 days ago [-]
In the early days of Bitcoin, we would argue about security models and laugh about Bitcoin mining taking some significant percentage of the global power supply. It's been hovering around 1% for a while now, despite the supply of new coins falling off.
I wouldn't put bets on what the outer limits for AI are going to be. However, it's a huge productivity boost across a huge range of workflows. Models are still making large gains as they become more sophisticated.
If I had Sam Altman's access to other people's capital, I would be making large bets that it will keep growing.
skywhopper 1 days ago [-]
He needs 11 figures of cash injected as soon as possible. The people who can give it want a big return. Given the current losses, the only way to make the math right is to lie outrageously about what’s possible.
karmakurtisaani 22 hours ago [-]
He's been kicking this can for years now. Looking forward to the day he's forced to stop.
refulgentis 1 days ago [-]
> Surely SamA doesn’t actually think that they’ll more than 20x their compute in a few years?
He does, or at least he believes that if it's plausible, they should attempt it.
We live in odd times. It sort of reminds me of Feb 2020: all you really needed to know was the Rt, and the rest was just math. Who knows if it'll matter or pencil out in a decade, but it's completely reasonable at these growth rates and with the iron laws of scaling known to keep holding.
jgalt212 13 hours ago [-]
> What is their angle with this?
My pet theory: Sam makes more money when OpenAI spends than when OpenAI earns.
buyucu 15 hours ago [-]
Why does OpenAI need so much more compute than everybody else? DeepSeek, Qwen and many others build competitive models that need much less capital.
yorwba 14 hours ago [-]
Chinese companies need to pay much higher prices for the same GPUs, so they would need to charge more to make a profit, but it's difficult to charge more unless they have a much better product. So building massive data centers to gain market share is riskier for them.
That said, Alibaba not releasing the weights for Qwen3-Max and announcing $53 billion in AI infrastructure spending https://www.reuters.com/world/china/alibaba-launches-qwen3-m... suggests that they think they're now at a point where it makes sense to scale up. (The Reuters article mentions data centers in several countries, which I assume also helps work around high GPU prices in China.)
Circling back to OpenAI: I don't think they're spending so much on infrastructure just because they want to train bigger models on more data, but more so because they want to serve those bigger models to more customers using their services more intensively.
credit_guy 15 hours ago [-]
Most likely OpenAI has models at least as efficient as DeepSeek or Qwen. Cerebras offers both GPT-OSS-120B and Qwen3-235B-Instruct [1]. Obviously, the second has roughly twice as many parameters as the first, but that's the closest comparison I can find. The Qwen model is twice as large, but also twice as slow (1400 tokens/second vs 3000) and 60% more expensive ($1.20 per million tokens vs $0.75). Now, OpenAI is running a proprietary model, and most likely it is much more optimized than the free version they release for public use.
[1] https://inference-docs.cerebras.ai/models/overview
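For reference, the comparison above restated as ratios; this sketch uses only the throughput and pricing figures quoted in that comment, nothing else.

    # Restate the quoted Cerebras figures as ratios.
    gpt_oss = {"params_b": 120, "tok_per_s": 3000, "usd_per_mtok": 0.75}
    qwen3   = {"params_b": 235, "tok_per_s": 1400, "usd_per_mtok": 1.20}

    size_ratio  = qwen3["params_b"] / gpt_oss["params_b"]           # ~1.96x larger
    speed_ratio = gpt_oss["tok_per_s"] / qwen3["tok_per_s"]         # ~2.14x faster
    price_ratio = qwen3["usd_per_mtok"] / gpt_oss["usd_per_mtok"]   # ~1.6x pricier

    print(f"Qwen3 is {size_ratio:.1f}x larger, "
          f"GPT-OSS serves {speed_ratio:.1f}x faster, "
          f"and Qwen3 costs {price_ratio:.1f}x more per token.")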
Inference is not the main cost driver; training and research are.
credit_guy 8 hours ago [-]
I'm not sure that's still the case. It used to be the case, but I doubt it continues to be. OpenAI had $6.7 BN costs for the first half of 2025. I doubt they spent $3 BN in training and research. They have 700 million weekly users, and many of these users are really heavy users. Just taking myself: I probably consumed a few million tokens with GPT-5-Codex in the last 3 days alone. I am a heavy user, but I think there are users who burn through hundreds of times more tokens than me.
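Whether inference now exceeds training and research spend is hard to settle from the outside, because the answer swings by orders of magnitude with the assumptions. A minimal sensitivity sketch, where only the 700M weekly users figure comes from the comment above and every per-user number is an illustrative guess:

    # How much of a ~$6.7B half-year cost base could plausibly be inference?
    # Only the 700M weekly users figure comes from the comment above; the
    # usage levels and per-token serving costs below are illustrative guesses.
    weekly_users = 700e6
    weeks = 26  # half a year

    scenarios = {
        "light usage, cheap serving":  {"tok_per_user_week": 10_000,  "usd_per_mtok": 0.50},
        "heavy usage, pricey serving": {"tok_per_user_week": 200_000, "usd_per_mtok": 2.00},
    }

    for name, s in scenarios.items():
        tokens = weekly_users * s["tok_per_user_week"] * weeks
        cost_bn = tokens / 1e6 * s["usd_per_mtok"] / 1e9
        print(f"{name:28s} ~{tokens:.1e} tokens, ~${cost_bn:.1f}B")

    # Spans roughly $0.1B to $7B for the half year, i.e. anywhere from a
    # rounding error to most of the reported cost base.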
matt-p 7 hours ago [-]
Absolutely not true.
matt-p 7 hours ago [-]
Because they have more distribution than anyone else? Pretty much everyone uses chat.com but almost nobody uses DeepSeek.
halJordan 4 hours ago [-]
They don't; your question is just as wrong as asking, "Why does Lockheed Martin need so many materials science engineers? Chengdu Aircraft makes fighters with 10x fewer."
lvl155 13 hours ago [-]
They’re trying to lock out competition from accessing compute.
Incipient 1 days ago [-]
Is it just me, or does this extreme demand for compute imply they've realised the core tech is mostly stagnant, and needs compute to possibly scale towards anything AGI-like? (however unlikely that is).
general1465 21 hours ago [-]
I see it the same way. It is like an automotive manufacturer using bigger and bigger engines with more and more pistons, wondering how much bigger the engine in the next model will need to be to go faster, and how many iterations it will take until they finally break the sound barrier. Meanwhile, their product looks like a school-bus box on wheels that will rip itself apart long before it even approaches the sound barrier.
adventured 13 hours ago [-]
They're substantially tied down by demand/usage right now.
How much more compute would they need to allow all of their paying users unlimited access to their best model? And to enable that setup in such a way that it's actually very fast.
The answer: they need resources far beyond what they have now. That's just to solve an existing problem.
Then throw in Sora 4 and whatever else will exist in a few years, and the need to feed that monster. They couldn't come close to allowing Sora 2 unlimited for all of their paying customers - I'd hate to see what that would require.
Then let the AI begin world-building for every user (which is where this is going). It'll be decades before the resource demands actually flatten globally (at least 20-30 years to reach global saturation on usage, assuming population growth slows rapidly and the population then shrinks).
Hint: the solution to Fermi's Paradox is that we go into the box and never come back out, because it's a lot more interesting for 99.9%+ of humanity than a bunch of repeating rocks in space that take a zillion years to reach. The core purpose of AI will be to world-build for us mentally (relaxation, stimulation, entertainment, social); in we go, the end. The same thing happens to any advanced beings that get to our level: there's little to nothing of interest out there that we can reach (and no, it doesn't matter if the HN crowd disagrees; what matters for this outcome are the masses). We'll definitely figure that out, and by contrast there's infinite stimulation and experience in the machine world.
Drblessing 1 days ago [-]
The AI-powered tiktok competitor is not going to be cheap on compute
banandys 1 days ago [-]
I mean yeah, we all saw the video of him stealing GPUs and getting arrested [1].
https://www.tomshardware.com/pc-components/dram/openais-star...
All for creating worthless TikTok videos with Sora 2, while we don't get graphics cards and DRAM and our electricity prices rise.
Trump will get another "win" for "his" Stargate project. The meeting with South Korean President Lee Jae Myung was NOT arranged by Altman; he is the messenger boy:
https://www.reuters.com/business/media-telecom/samsung-sk-hy...
And our water runs out, and we pollute and destroy the planet past the point of no return.
AI will fix it though?
[1]: https://www.youtube.com/shorts/jNFemZpMadU
mhh__ 2 hours ago [-]
I believe the latest project is to teleport the water to Mars; after all, it must be disappearing somewhere.
logtempo 16 hours ago [-]
Hey, at least you'll be able to add that exterminated tiger species to the postcard from your last adventure trip. And more water to that river, with some greener trees, etc.
All of that without leaving your home ofc.
panta 14 hours ago [-]
Of course. According to Andreessen, if you are not optimistic and you worry about the environment, you are an "enemy" of the bright future ahead (while at the same time he puts Nick Land on his list of "Saints"). These people are deranged psychopaths; why are we leaving them at the wheel?
johnnienaked 13 hours ago [-]
Sounds about right coming from the guy who plagiarized university research to make his billions
Too big to fail is the goal. If the world is powered by OpenAI but it ain't making a profit in 2028, they can just put on their "we're a utility like water" face mask and get bailed out.
stevenwoo 1 days ago [-]
At least in the USA, I think if consumers realized that their power bills going up every year is tied to these new data centers, there would be more political opposition to new data centers.
https://apnews.com/article/electricity-prices-data-centers-a...
I don't know if the electricity markets work differently in other countries.
omcnoe 1 days ago [-]
The US needs sufficient energy surplus to power industry. US energy production has been essentially flat for the past 25 years, and the country has forgotten how to bring new capacity online. Chinese energy production is up over 6x over the same period. China has more clean energy generation capacity today than its entire generation capacity a little over a decade ago.
Instead of panicking about data center electricity usage we need to be worrying about getting back to a state where we regularly bring new (clean) generation capacity online.
chermi 14 hours ago [-]
That's not the main problem. That's the convenient scapegoat so we don't get mad about the real problem. Power bills have been going up for years. We're just not good at generating and serving sufficient energy. Our grid sucks, our utilities suck and can do whatever they want, and we can't build anything. And the grid problems get worse as we add renewables, since the grid has to manage more complex generation profiles. (I'm all for renewables, love solar.)
MountDoom 1 days ago [-]
Taxpayers subsidize data centers in many other ways. These are prestige projects for politicians, so they often get long-term tax breaks and other preferential treatment.
I think it's part vanity, part a misunderstanding about the economic benefits of a datacenter (which are nearly nil, as they employ very few people and produce nothing for the local market), and part just a desire to score brownie points with wealthy corporations, which might mean donations, campaign support, or other perks for the politician who makes it happen.
fooker 1 days ago [-]
It's correlated with data centers, not tied to them. That's an important difference.
We could easily build, say, 10 nuclear reactors and halve the utility bills with amortization.
noosphr 1 days ago [-]
The power bill going up is because the US, and the West in general, bet on renewables and a low energy future.
Neither of those things turns out to be a good fit for the new economy. The only thing left for people who derided nuclear for the last 40 years is to hope this is a bubble that sends us back to the 17th century when it pops. Anything else means we have to invest trillions in nuclear right now.
DavidPiper 1 days ago [-]
Genuine question from a non-American: What is "the new economy"?
noosphr 1 days ago [-]
Malthusianism for computers.
Moore's law is dead. The only way to increase compute is to increase the power we feed to computers. AI is just the shiniest example. Everything else will follow suit until electricity costs increase enough that it doesn't make sense to throw any more computation at it.
Any country that doesn't have energy to spare will be in the position of countries which didn't have food to support armies before the industrial revolution.
fooker 1 days ago [-]
Interesting point. I can see this could turn out to be true.
If we needed, for example, 1000 TWh to power AI for a huge drone swarm but could not do it while China could, this would be problematic.
It requires a future where MAD with nuclear weapons is obsolete though, with some futuristic new missile defense tech. I don't see that happening until some currently unknown physics makes it possible.
jimbob45 1 days ago [-]
What’s your beef with solar? Both parties seem to like it just fine, despite it not covering as much of the total demand as anyone would like.
noosphr 22 hours ago [-]
Both parties like it better because it turns the electricity market into another casino that lets you take billions from the parts of the economy that do things.
I worked as a quant in the electric market. There wasn't a single dataset I saw where more renewables resulted in lower total costs for consumers.
Drblessing 1 days ago [-]
The amount of money being invested in AI should've been invested into nuclear, both fusion and fission. The AI bubble will burst, but the energy bubble never bursts.
kortilla 1 days ago [-]
That’s a fun trope but it’s a terrible outcome for shareholders.
avs733 1 days ago [-]
Which means it will be made into a terrible outcome for everyone.
jacquesm 1 days ago [-]
Good.
throwaway290 1 days ago [-]
shareholders like a business that can never fail...
lemonlearnings 1 days ago [-]
Less terrible than being allowed to go bust though.
nelsonic 1 days ago [-]
[flagged]
bwfan123 1 days ago [-]
> Definitely not a bubble. ;)
It's hard to square that with total revenues of $6B in 1H.
roenxi 1 days ago [-]
They haven't put ads in ChatGPT yet as far as I know. It doesn't really make sense to work off the raw revenue before they put the main revenue-raiser in. Search isn't a very impressive money maker either if we take AdWords out.
sverhagen 1 days ago [-]
I have a subscription. Dang, why does every profit model have to involve ads?
fooker 1 days ago [-]
Because not everyone wants to pay for a subscription, and certainly not the 5 billion people in Asia in any reasonable numbers.
There's a reason Google and Facebook run the tech world as we know it.
thereitgoes456 1 days ago [-]
You and everyone else seem to assume on faith that OpenAI's ads revenue is going to dwarf their subscription revenue -- but you're being suckered. If you do the math, you'll find it's not nearly as clear-cut as you think.
Not impossible, but not a given.
og_kalu 1 days ago [-]
>If you do the math
On the contrary, the math makes it very clear. They need a free-user ARPU of $11 to $12 per quarter to be profitable with billions to spare.
That's a low bar to clear for a platform with 700M+ weekly active users who share more personal context with it than with any Google search.
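Spelling that arithmetic out: the ARPU range and the user count are the figures from the comment above; whether the resulting revenue actually covers OpenAI's cost base is the part left implicit.

    # Annual ad revenue implied by the quoted free-user ARPU target.
    weekly_active_users = 700e6
    for arpu_per_quarter in (11, 12):  # dollars per free user per quarter
        annual_revenue_bn = weekly_active_users * arpu_per_quarter * 4 / 1e9
        print(f"ARPU ${arpu_per_quarter}/quarter -> ~${annual_revenue_bn:.0f}B/year")
    # Prints ~$31B and ~$34B per year.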
leptons 13 hours ago [-]
Their total losses far exceed any revenue they are taking in.
lemonlearnings 1 days ago [-]
Bitcoin:
AI: Hold my beer.
username223 1 days ago [-]
> $16Bn in compute spend (rent) rising to $400bn in 4 years. Definitely not a bubble. ;)
Just to put that in context, US GDP last year was about $30tn, and Apple's revenue was about $400bn. So Altman is saying he wants to spend around 1% of US GDP, or roughly all of Apple's revenue, on compute alone in 2029. He's clearly using the "fire all your white collar employees" pitch deck at the very least, if not the "prepare to meet your Silicon God" one.
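Making those ratios explicit, using only the round numbers quoted in that comment:

    # Scale comparison from the comment above; inputs are the quoted round numbers.
    compute_spend_2029 = 400e9   # claimed compute spend (rent) in 4 years
    us_gdp             = 30e12   # ~US GDP last year
    apple_revenue      = 400e9   # ~Apple's annual revenue

    print(f"share of US GDP:   {compute_spend_2029 / us_gdp:.1%}")         # ~1.3%
    print(f"vs Apple revenue:  {compute_spend_2029 / apple_revenue:.0%}")  # ~100%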
dgfitz 1 days ago [-]
I also feel like there is a bubble.
However, looking at it impartially, it seems to mean sama is planning on making more money, which points to ads being incoming.
ares623 1 days ago [-]
We give LLMs a lot of shit when they get lost in context, digging themselves deeper and deeper into illogical trains of thought.
But isn't this sama doing the same thing? He CANNOT stop and MUST NOT slow down. The slightest display of hesitation or doubt will make it all come crashing down.
And everyone else is doing the same. It’s an international game of chicken.
nakamoto_damacy 1 days ago [-]
[flagged]
nebula8804 1 days ago [-]
It's cheap because you aren't paying the full cost; you are externalizing some of the costs onto others.