Potentially unpopular take: memory manufacturers have been operating on the margins of profitability for quite a while now. Their products are essentially an indistinguishable commodity. Memory from Samsung or Micron or another manufacturer may have slight differences in overclockability, but that matters little to folks who just want a stable system. Hopefully the shortage leads large purchasers to engage in long-term contracts with the memory manufacturers which give them the confidence to invest in new fabs and increased capacity. That would be great for everyone. Additionally, we're likely to see Chinese fab'd DRAM now, which they've been attempting since the '70s but never been competitive at. With these margins, any new manufacturer could gain a foothold.
If LLMs' utility continues to scale with size (which seems likely as we begin training embodied AI on a massive influx of robotic sensor data) then they will continue to gobble up memory for the near future. We may need both increased production capacity _and_ a period of more efficient software development techniques, as was the case when a new 512 KB upgrade cost $1,000.
Aurornis 2 days ago [-]
> Hopefully the shortage leads large purchasers to engage in long-term contracts with the memory manufacturers which give them the confidence to invest in new fabs and increased capacity.
Most DRAM is already purchased through contracts with manufacturers.
Manufacturers don't actually want too many extremely long term contracts because it would limit their ability to respond to market price changes.
Like most commodities, the price you see on places like Newegg follows the "spot price", meaning the price to purchase DRAM for shipment immediately. The big players don't buy their RAM through these channels, they arrange contracts with manufacturers.
The contracts with manufacturers will see higher prices in the future, but they're playing the long game and will try to delay or smooth out purchasing to minimize exposure to this spike.
> Additionally, we're likely to see Chinese fab'd DRAM now, which they've been attempting since the '70s but never been competitive at.
Companies like Samsung and SK Hynix have DRAM fabs in China already. This has been true for decades. You may have Chinese fab'd DRAM in the computer you're using right now.
Are you referring to complete home-grown DRAM designs? That, too, was already in the works.
throwaway2037 2 days ago [-]
> Manufacturers don't actually want too many extremely long term contracts because it would limit their ability to respond to market price changes.
I don't agree with this sentence. Why wouldn't the same advice apply to oil and gas contracts? If you look at the size and duration of oil and gas contracts for major energy importers, they often run 10 years or more. Some of the contracts in Japan and Korea are so large that heavy industrial/chemical customers will take an equity stake in the extraction site.
Other than silicon, power, and water (and a tiny amount of plastic/paper for packaging), what else does a fab need that only produces DRAM? If true, then power is far and away the most variable input cost.
overfeed 1 days ago [-]
> Why wouldn't the same advice apply to oil and gas contracts?
Because oil & gas suppliers only ever sell one product, and memory fabs can dynamically switch product mix in response to supply & demand to optimize profits. The same sand, power and water can make DDR4, HBM or DDR5.
throwaway48476 1 days ago [-]
Oil and gas suppliers have several products: gas, diesel, Jet A, propane, naphtha, asphalt, etc.
bmicraft 1 days ago [-]
Aren't the proportions in those essentially static?
ahartmetz 19 hours ago [-]
Cracking can turn heavier oil fractions into lighter oil fractions. It's a very common procedure.
eternauta3k 1 days ago [-]
No, refineries make more heating oil in winter and more gasoline in summer driving season.
pletnes 1 days ago [-]
Depends a lot on the oil field, geology is random
adastra22 1 days ago [-]
none of those finished products come out of the ground that way.
buran77 1 days ago [-]
Neither do chips, even if they all start as silicon from the ground. What the earlier comment was saying is that the actual composition of crude oil varies by location, so you aren't necessarily getting the same ratio of finished products out of the process. With silicon you have a bit more control over what goes into the fab. But you're still at the mercy of demand from the market.
throwaway48476 23 hours ago [-]
The crude composition defines a range of possible products, not exact ratios. Longer chain hydrocarbons are also cracked to yield more light products.
buran77 20 hours ago [-]
> defines a range of possible products, not exact ratios
I'm not sure I follow; a varying range necessarily implies varying ratios (e.g. a product missing from the range means its ratio is zero).
Even when in theory you can obtain some higher quality products, the composition of the crude can make it too complex and expensive to practically obtain them.
You don't want to refine gasoline from heavy crude, especially in winter when demand is lower. For gasoline or kerosene you want to start from lighter crude. Same with many undesired components (either from the crude or resulting from the refining methods): the more you have, the more complex the refining, and the resulting ratio of products you obtain varies.
So in practice what you get out of the refining process absolutely depends on the characteristics of the crude, and many other things like market demand or the capability of your refinery.
Same as with silicon. The process to make the wafer results in different quality if you want to make low tech or cutting edge semiconductor products.
pletnes 21 hours ago [-]
That way? I was trying to say that the mix of hydrocarbon molecules is different for each and every oil field due to local geological variation. Even within the field, since e.g. lighter molecules presumably come out first.
chickensong 1 days ago [-]
The factory must grow
metaphor 1 days ago [-]
Are you seriously trying to compare raw commodity inputs traded on the futures market to finished semiconductor products that are expected to become deprecated, uncompetitive and/or EOL'd in a few years?
misja111 1 days ago [-]
Yes it looks like he does. And I don't see why not.
The fact that their products become deprecated gives manufacturers even more incentive to want long-term contracts.
anonymars 24 hours ago [-]
Now do a comparison to hog or corn futures
saxenaabhi 1 days ago [-]
The same does apply to gas contracts. Many a time, LNG companies break contracts and pay hefty penalties if the spot rate is high enough.
mlrtime 1 days ago [-]
>Many a time, LNG companies break contracts and pay hefty penalties if the spot rate is high enough.
What do you mean "Break contracts"? I thought the conversation was about Futures contracts, you don't break them. You sell your contract or you take/give delivery (or cash settle).
georgefrowny 22 hours ago [-]
There's no specific mention of futures upthread of this comment.
Not all gas is sold by futures; you can have a contract for, say, delivery of 20 million cubic metres of gas a year and a penalty if that isn't met. Some people actually want the gas for gas-related purposes rather than as a financial phantom.
Same for DRAM - Dell actually wants the chips to put in computers, an economic abstraction doesn't help much when you need to ship real computers to get paid, and many customers aren't in the market for a laptop future (Framework pre-orders notwithstanding).
throwaway2037 13 hours ago [-]
As I understand hydrocarbon trading (oil and gas), futures is a tiny portion of the settled market. The vast majority is traded through long-term, privately negotiated contracts. As I said previously, many of those contracts are so large that the end buyer takes an equity stake in the extraction site.
Great point. You are the only one who mentioned this! Example: photoresist film.
Tuna-Fish 1 days ago [-]
> Other than silicon, power, and water (and a tiny amount of plastic/paper for packaging), what else does a fab need that only produces DRAM? If true, then power is far and away the most variable input cost.
Borrowing costs can be wildly variable and are the main cost of making silicon. All the "inputs" over the lifecycle of a fab are so completely dwarfed by the initial capital costs that you can pretty much ignore them in any economic analysis. The cost of making chips is the cost of borrowing money to pay for capital costs, and the depreciation of the value of that capital.
throwaway2037 23 hours ago [-]
This theory sounds nice, but do you have any sources to share? For example, I assume a fab runs for about 20-30 years. The labor inputs must be very high over this period. Basically, there are no poorly paid people inside a fab. And wouldn't "wildly variable borrowing costs" also affect oil and gas producers, who need to finance the research phase and construction of the plant?
Tuna-Fish 21 hours ago [-]
> For example, I assume a fab runs for about 20-30 years.
If only.
20 years ago, fabs were being built to use 90nm class technology. Chips made on such an old node are so cheap today that they can't pay even a fraction of a percent of the plant's capital costs per year. So all of its capital has to have been depreciated a long time ago.
The oldest process node in high-volume production for memory is currently 1α, which started production in January 2021. It is no longer capable of making high-end products and is definitely legacy, and also has to have essentially depreciated all of the capital costs. The time a high-end fab stays high-end and can command premium prices, and during which it has to depreciate all the capital is ~3-5 years. After that either you push the plant to produce legacy/low price and low margin items, or you rebuild it with new tools with costs >$10B.
Also, even if fabs did last 20-30 years, the capital costs would dominate.
> And wouldn't "wildly variable borrowing costs" also affect oil and gas who need to finance the research phase and construction of the plant?
I don't understand? Nothing else costs anywhere near as much capital to produce as silicon chips. Thanks to the inexorable force of Moore's second law, fabs are machines that turn capital investment into salable product; nothing like them has ever existed before.
georgefrowny 21 hours ago [-]
Micron Fab 6 is about 2300 staff (including 1000 contractors).
Even if you pay them all 500k per year, that's "only" about a billion a year in payroll.
The New York fab plan costs something like 20 billion more or less now to build, with 100 billion over 20 years.
Also, maybe the calculus is different right now in the US, but it used to be that semiconductor workers were expected to have PhDs coming out of their ears yet were not actually paid very well, with salaries in Taiwanese fabs being around the $50-60k mark and lower paid workers being more like $20k or less. Presumably US fabs will be automated to an even greater extent due to labour costs.
So it's very possible that servicing debt on the capital outlay is substantially more expensive than the payroll.
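A rough back-of-envelope sketch of that comparison in Python. All figures are illustrative assumptions pulled from the numbers above (build cost, headcount, a guessed cost of capital), not anyone's actual financials:

    # Back-of-envelope: annual capital-related costs vs payroll for a new fab (illustrative)
    capex = 20e9                 # assumed build cost in USD, per the New York fab figure above
    useful_years = 5             # assumed window before the node becomes legacy
    cost_of_capital = 0.06       # assumed interest / hurdle rate

    annual_depreciation = capex / useful_years        # ~4.0e9 per year
    annual_interest = capex * cost_of_capital         # ~1.2e9 per year

    staff = 2300
    cost_per_head = 500_000                            # deliberately generous, as above
    annual_payroll = staff * cost_per_head             # ~1.15e9 per year

    print(annual_depreciation + annual_interest)       # ~5.2e9
    print(annual_payroll)                              # ~1.15e9

Even with a very generous payroll assumption, the capital side comes out several times larger, which is the point being made above.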
timschmidt 2 days ago [-]
> Are you referring to complete home-grown DRAM designs? That, too, was already in the works.
As I mentioned, various groups within China have been working on China-native DRAM since the '70s. What's new are the margins and market demand that allow them to be profitable with DRAM that is still several years behind the competition.
throwaway48476 1 days ago [-]
Lots of low-end Android boxes use CXMT DDR4.
lwhi 1 days ago [-]
I've just remembered a similar situation in 1993!
A Japanese factory that made epoxy resin for chips was destroyed and the price of SIMMs skyrocketed (due to lack of availability).
I remember being very upset that I wasn't going to be able to upgrade to 4MB.
hmng 1 days ago [-]
There should be a t-shirt for that ;-)
I remember paying insane prices that year.
wkat4242 2 days ago [-]
Well, what really prompted this crisis is AI, as well as Samsung shutting down some production (and I have to say I don't think they mind that the pricing has skyrocketed as a result!)
But yes we're going to need more fabs for sure
embedding-shape 2 days ago [-]
> Well, what really prompted this crisis is AI,
If the shortage of RAM is because of AI (so servers/data centers I presume?), wouldn't that mean the shortage should be localized to RDIMM rather than the much more common UDIMM that most gaming PCs use? But it seems to me like the pricing is going up more for UDIMM than RDIMM.
wmf 2 days ago [-]
UDIMM and RDIMM use the same DRAM chips. And my understanding is that the fabs can switch between DDR5, LPDDR5, and maybe HBM as needed. This means high demand for one type can create a shortage of the others.
embedding-shape 1 days ago [-]
> This means high demand for one type can create a shortage of the others.
Wouldn't that mean that a shortage of DRAM chips should cause a price difference in all of them? Not sure that'd explain why RDIMM prices aren't rising as sharply as UDIMM. That the fab and assembly lines have transitioned into making other stuff makes sense as an explanation for the difference though, as bradfa mentioned in their reply.
Aurornis 2 days ago [-]
It's a valid question if you're not familiar with the RAM market. Sorry you're getting downvoted for it.
The manufacturers make the individual chips, not the modules (DIMMs). (EDIT: Some companies that make chips may also have business units that sell DIMMs, to be pedantic.)
The R in RDIMM means registered, aka buffered. A separate register chip buffers the signals between the memory chips and the controller.
Even ECC modules use regular memory chips, but with extra chips added for the ECC capacity.
It can be confusing. The key thing to remember is that the price is driven by the price of the chips. The companies that make DIMMs are buying chips in bulk and integrating them on to PCBs.
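As a toy illustration of chip prices dominating module prices (the chip count matches a common module layout, but the dollar figures are placeholders I made up, not current spot prices):

    # Toy model: DIMM price tracks the DRAM chip price (illustrative numbers only)
    chips_per_module = 16          # e.g. a 32 GB UDIMM built from 16 x 16 Gb chips
    chip_price = 7.50              # assumed USD per chip, placeholder
    pcb_assembly_margin = 15.00    # assumed PCB, SPD, assembly and margin per module

    module_price = chips_per_module * chip_price + pcb_assembly_margin
    print(module_price)            # the chip term dominates, so module prices follow chip prices

Double the chip price and the module price nearly doubles with it; the rest of the bill of materials barely moves.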
consp 2 days ago [-]
> Even ECC modules use regular memory chips, but with extra chips added for the ECC capacity.
Quite a few unbuffered designs in the past had a "missing chip". If you ever wondered why a chip was missing on your stick, it's missing ECC. Don't know if it's still the case with DDR5 though.
buildbot 2 days ago [-]
I have not seen that yet with DDR5, I think the signal integrity requirements are too high now to even have unused pads open. Most sticks don’t appear to have many traces at all on the top/bottom sides, just big power/ground planes.
Also with DDR5 each stick is actually 2 channels so you get 2 extra dies.
bbarnett 1 days ago [-]
There's some new half-assed ECC type of RAM, not sure of the name.
Was reading a series of displeased posts about it. Can't seem to find it now.
lmz 1 days ago [-]
On-die ECC for DDR5. Which corrects locally but does not signal the host or deal with data between the die and the CPU.
bbarnett 1 days ago [-]
Thanks, drove me bananas trying to find that again.
throwaway48476 1 days ago [-]
There are 9 bits in an ECC byte.
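Spelling out that ratio with the classic 72-bit DDR4 layout (a minimal sketch; DDR5's subchannel variants differ by module type, so I'm leaving those aside):

    # Classic sideband ECC: one extra bit per byte, i.e. one extra x8 chip per eight x8 chips
    data_bits, ecc_bits = 64, 8           # DDR4 ECC DIMM: 72-bit wide bus
    print((data_bits + ecc_bits) // 8)    # 9 -> nine bits carried per byte of data
    print(ecc_bits / data_bits)           # 0.125 -> 12.5% extra chips/capacity for ECC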
bradfa 2 days ago [-]
Because manufacturers transitioned fab and assembly lines from low-margin DRAM to higher-margin products like HBM, hence reducing DRAM supply. But the demand for consumer-grade DRAM hasn’t changed much, so prices for it go up.
PunchyHamster 2 days ago [-]
The chips come from the same factory. And the difference between those two is... a buffer chip. And an extra RAM die for ECC.
dboreham 2 days ago [-]
Same chips in both, made in the same fabs. Any relative price difference is like the difference between regular and premium gas/petrol.
dylan604 2 days ago [-]
wait, are you saying that there's no difference between regular and premium gas?
MichaelNolan 1 days ago [-]
The “regular” and “premium” label at the pump is misleading. The premium gas isn’t better. It’s just different. Unless your car specifically requires higher octane fuel, there is no benefit to paying for it. https://www.kbb.com/car-advice/gasoline-guide/
throwaway48476 1 days ago [-]
You get slightly better mpg on premium, just not enough to justify the cost.
addaon 1 days ago [-]
Not unless you’re adjusting timing. Premium gas has lower energy per unit mass and per unit volume than standard gas.
fragmede 1 days ago [-]
> Not unless you’re adjusting timing.
Which, every modern ECU will do automatically based on output from the knock sensors.
timschmidt 2 days ago [-]
This may surprise you, but gas stations typically only have two grades of fuel stored in tanks. Mid-grade gas is mixed at the pump from the other two.
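For anyone curious what "mixed at the pump" works out to, here's a tiny sketch assuming octane ratings blend roughly linearly (a common approximation, not an exact rule):

    # Blending mid-grade from regular and premium, assuming roughly linear octane blending
    regular, premium, target = 87, 93, 89
    premium_fraction = (target - regular) / (premium - regular)
    print(premium_fraction)   # ~0.33 -> roughly one part premium to two parts regular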
PunchyHamster 2 days ago [-]
no, but made in the same place with mostly the same ingredients, just a different ratio to hit higher octane (and in some cases some extra additives).
Also they vary a bit between winter and summer; basically in winter they can get away with putting in a bit more volatile compounds because it's colder.
caycep 2 days ago [-]
It's a sad trend for "the rest of us" and history in general. The economic boom of the 80's thru the 2010s has been a vast democratization of computation - hardware became more powerful and affordable, and algorithms (at least broadly if not individually) became more efficient. We all had supercomputers in our pockets. This AI movement seems to move things in the opposite direction, in that us plebeians have less and less access to RAM, computing power and food and...uh...GPUs to play Cyberpunk; and are dependent on Altermanic aristocracy to dribble compute onto us at their leisure and for a hefty tithe.
I am hoping some of that Clayton Christensen disruption the tech theocracy keep preaching about comes along with some O(N) decrease in transformer/cDNN complexity that disrupts the massive server farms required for this AI boom/bubble thing.
Aurornis 2 days ago [-]
> This AI movement seems to move things in the opposite direction, in that us plebeians have less and less access to RAM, computing power and food and...uh...GPUs to play Cyberpunk; and are dependent on Altermanic aristocracy to dribble compute onto us at their leisure and for a hefty tithe.
Compute is cheaper than ever. The ceiling is just higher for what you can buy.
Yes, we have $2000 GPUs now. You don't have to buy it. You probably shouldn't buy it. Most people would be more than fine with the $200-400 models, honestly. Yet the fact that you could buy a $2000 GPU makes some people irrationally angry.
This is like the guy I know who complains that pickup trucks are unfairly priced because a Ford F-150 has an MSRP of $80,000. It doesn't matter how many times you point out that the $80K price tag only applies to the luxury flagship model, he anchors his idea of how much a pickup truck costs to the highest number he can see.
Computing is cheaper than ever. The power level is increasing rapidly, too. The massive AI investments and datacenter advancements are pulling hardware development forward at an incredible rate and we're winning across the board as consumers. You don't have to buy that top of the line GPU nor do you have to max out the RAM on your computer.
Sometimes I think people with this mentality would be happier if the top of the line GPU models were never released. If nVidia stopped at their mid-range cards and didn't offer anything more, the complaints would go away even though we're not actually better off with fewer options.
lmm 1 days ago [-]
> Sometimes I think people with this mentality would be happier if the top of the line GPU models were never released. If nVidia stopped at their mid-range cards and didn't offer anything more, the complaints would go away even though we're not actually better off with fewer options.
If the result was that games were made and optimised for mid-range cards, maybe regular folks actually would be better off.
alex43578 1 days ago [-]
Excluding a few poorly-optimized recent releases, what games can't run on like a 3070?
nottorp 1 days ago [-]
3070 isn't mid range.
Low end is Ryzen integrated graphics now, xx60 is mid range at best. Maybe even xx50 if those still exist.
sofixa 22 hours ago [-]
> xx60 is mid range at best. Maybe even xx50 if those still exist
"It's mid range if it exists" doesn't make sense.
Also you're missing that they're talking about 3070, a card from 2020 (5 years ago), 2 generations behind this year's 50xx series. The 30xx matters more than the xx70 here. It was an upper midrange card when it came out, and it's solidly midrange for Nvidia's product lineup today. You can have cheaper and decent just fine (integrated Ryzens like you mentioned are fine for 1080p gaming on most titles).
ed_elliott_asc 2 days ago [-]
The thing about being annoyed at the top-of-the-range prices, for me, is that it feels like they drag the lower models' prices upwards.
nottorp 1 days ago [-]
It does. If the top range is 80k you'll feel you're getting a deal for 40k.
So no one makes a 25k model.
woodson 2 days ago [-]
But it’s not like the lower priced models are subsidizing the high-end models (probably the opposite; the high-end ones have greater margins).
fzeroracer 1 days ago [-]
> Yes, we have $2000 GPUs now. You don't have to buy it. You probably shouldn't buy it. Most people would be more than fine with the $200-400 models, honestly. Yet the fact that you could buy a $2000 GPU makes some people irrationally angry.
This is missing the forest for the trees quite badly. The $2000 GPUs are what previously would've been $600-700, and the $200-400 GPUs are now $600-700. Consumers got a shit end of the deal when crypto caused GPUs to spike and now consumers are getting another shitty deal with RAM prices. And even if you want mid range stuff it's harder and harder to buy because of how fucked the market is.
It would be like if in your example companies literally only sold F-150s and stopped selling budget models at all. There isn't even budget stock to buy.
Aerroon 1 days ago [-]
The problem is the VRAM segmentation.
A GTX 1080 came out in the first half of 2016. It had 8 GB of VRAM and cost $599 with a TDP of 180W.
A GTX 1080 Ti came out in 2017 and had 11 GB of VRAM at $799.
In 2025 you can get the RTX 5070 with 12 GB of VRAM. They say the price is $549, but good luck finding them at that price.
And the thing with VRAM is that if you run out of it then performance drops off a cliff. Nothing can make up for it without getting a higher VRAM model.
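Some rough ballpark numbers on why the drop-off is a cliff rather than a gentle slope (both bandwidth figures are approximations I'm assuming, not measurements of any specific card):

    # Why spilling out of VRAM hurts: the spilled working set crosses PCIe instead (rough ballparks)
    vram_bandwidth_gbs = 450     # assumed GDDR6 bandwidth on a midrange card
    pcie4_x16_gbs = 32           # assumed usable PCIe 4.0 x16 bandwidth

    slowdown = vram_bandwidth_gbs / pcie4_x16_gbs
    print(slowdown)              # ~14x slower access for whatever doesn't fit in VRAM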
jodrellblank 21 hours ago [-]
> "They say the price is $549, but good luck finding them at that price."
I did one Google search for "rtx 5070 newegg usa" and they have MSI Ventus GeForce RTX 5070 12G down from $559 to $499 for Black Friday, and ASUS Prime RTX 5070 12GB for $543.
-the whole reason why the GPU is $2000 is because of said AI bubble sucking up wafers at TSMC or elsewhere, with a soupçon of Jensen's perceived monopoly status...
-for a good part of the year, you could not actually buy said $2000 GPU (I assume you are referring to the 5090) also because of said AI bubble
(granted, while Jensen does not want to sell me his GPU, I would like to point out that Tim Cook has no problem taking my money).
on that point, I can go and buy a Ford F-150 tomorrow. Apparently, per the article, I would have problems buying bog-standard DDR5 DIMMs to build my computer.
timschmidt 2 days ago [-]
One can see it that way, granted. When I zoom all the way out, all of consumer computation has existed as sort of an addendum or ancillary organ to the big customers: government, large corporations, etc. All our beloved consumer tech started out as absurdly high priced niche stuff for them. We've been sold the overflow capacity and binned parts. And that seems to be a more-or-less natural consequence of large purchasers signing large checks and entering predictable contracts. Individual consumers are very price sensitive and fickle by comparison. From that perspective, anything that increases overall capacity should also increase the supply of binned parts and overflow. Which will eventually benefit consumers. Though the intervening market adjustment period may be painful (as we are seeing). Consumers have also benefited greatly from the shrinking of component sizes, as this has had the effect of increasing production capacity with fixed wafer volume.
Aurornis 2 days ago [-]
> When I zoom all the way out, all of consumer computation has existed as sort of an addendum or ancillary organ to the big customers: government, large corporations, etc.
Perfectly stated. I think comments like the one above come from a mentality that the individual consumer should be the center of the computing universe and big purchasers should be forced to live with the leftovers.
What's really happening is the big companies are doing R&D at incredible rates and we're getting huge benefits by drafting along as consumers. We wouldn't have incredible GPUs in our gaming systems and even cell phones if the primary market for these things was retail entertainment purchases that people make every 5 years.
codebje 1 days ago [-]
The iPhone wasn't designed or marketed to large corporations. 3dfx didn't invent the voodoo for B2B sales. IBM didn't branch out from international business machines to the personal computer for business sales. The compact disc wasn't invented for corporate storage.
Computing didn't take off until it shrank from the giant, unreliable beasts of machines owned by a small number of big corporations to the home computers of the 70s.
There's a lot more of us than them.
There's a gold rush market for GPUs and DRAM. It won't last forever, but while it does high volume sales at high margins will dominate supply. GPUs are still inflated from the crypto rush, too.
Aurornis 21 hours ago [-]
> 3dfx didn't invent the voodoo for B2B sales.
3Dfx was not the inventor of the GPU. There’s a long history of GPU development for corporate applications.
The iPhone wasn’t the first mobile phone. Early mobile phones were very expensive and targeted as businesses who wanted their executives in touch
You’re still thinking from a consumer-centric view. Zoom out and those consumer companies were not the first to develop the products. You didn't even think about the actual originators of those types of products because you don’t see them as a consumer.
thaumasiotes 1 days ago [-]
> The iPhone wasn't designed or marketed to large corporations.
The iPhone isn't exactly a consumer computation device. From that perspective, it does less work at a higher cost.
codebje 1 days ago [-]
I'm sorry, but I'm not sure if you're implying you dislike Apple's approach to what the user is allowed to do, or suggesting we should only talk about general purpose computing devices. If it's the latter, sure, the iPhone's not an innovation in that space, discard it from my list of examples. If it's the former, I'll give you that too, but it was still the first of its kind, by a large margin.
(I remember the huge window in which phone companies desperately put out feature phones with sub-par touch screens, completely missing the value to consumers. The iPod Touch should've been warning enough... and should've been (one of) my signal(s) to buy Apple stock, I guess :-)
somenameforme 1 days ago [-]
Advances in video cards and graphics tech were overwhelmingly driven by video games. John Carmack, for instance, was directly involved in these processes and 'back in the day' it wasn't uncommon for games, particularly from him, to be developed to run on tech that did not yet exist, in collaboration with the hardware guys. Your desktop was outdated after a year and obsolete after 2, so it was a very different time from today, where your example is not only completely accurate but really understating it - a good computer from 10 years ago can still do 99.9% of what people need, even things like high end gaming are perfectly viable with well dated cards.
Aurornis 21 hours ago [-]
> a good computer from 10 years ago can still do 99.9% of what people need, even things like high end gaming are perfectly viable with well dated cards.
HN is strange. I have an old gaming build from 7-8 years ago and while it can do high end games on low settings and resolution, it doesn’t hold a candle to even a mid-range modern build.
“viable” is doing a lot of work in that claim. You can tolerate it at low res and settings and if you’re okay with a lot of frame rate dips, but nobody is going to mistake it for a modern build.
You’re also exaggerating how fast video cards became obsolete in the past. Many of us gamed just fine on systems that weren’t upgraded for 5-6 years at a time.
somenameforme 18 hours ago [-]
I'll take the absurd extreme end of my claim. Here [1] is a video of somebody running modern games on a GeForce GTX 1080 Ti, a card that was high end... 8 years ago. And he's doing it on high-ultra settings in 4k, and it's still doing fine. Spend a couple of hundred on a "new" video card and he'd be rocking a stable 60+FPS on everything, with some games he's still hitting that even with his card!
And back in the early 2000s, even bleeding edge current-year rigs would struggle with new games like Doom 3, Far Cry, Crysis, and so on. Hardware was advancing so rapidly that games were being built in anticipation of upcoming hardware, so you had this scenario where high end systems bought in one year would struggle with games released that year, let alone systems from 5-6 years prior.
Obviously if you're referencing CRPGs and the like, then yeah - absolutely anything could run them. The same remains even more true today. Baldur's Gate 3's minimum requirement is a GTX 970, a card more than 11 years old. Imagine a 1989 computer trying to run Baldur's Gate 2!
Yes. A good reason to upgrade was PCIe 4.0 for I/O. GPU and SSD needs caused PCIe 5.0 to follow soon after.
iszomer 12 hours ago [-]
I'm still on PCIe 3.0 on my main machine and the RX580 works fine for my needs. Outside of the scope of OP, I recently picked up a (new) 5060 not due to the impending memory production apocalypse but because I wanted to extend my current setup with something I recently read about on LSFG, previously posted here but garnered no interest/comments.
> We wouldn't have incredible GPUs in our gaming systems and even cell phones if the primary market for these things was retail entertainment purchases that people make every 5 years.
Arguably we don't. Most of the improvements these days seem to be on the GPGPU side with very little gains in raster performance this decade.
Aurornis 21 hours ago [-]
> with very little gains in raster performance this decade.
I have a flagship 7-8 year old GPU in one machine and a mid-level modern GPU in another.
It’s flat out wrong to claim “very little gains” during this time. The difference between those two GPUs is huge in games. The modern GPU also does it with far less power and noise.
I can’t understand this HN mentality that modern hardware isn’t fast or that we’re not seeing gains.
HPsquared 2 days ago [-]
Gaming drove the development of GPUs which led to the current AI boom. Smartphones drove small process nodes for power efficiency.
timschmidt 2 days ago [-]
SGI and 3Dfx made high-end simulators for aerospace in the beginning. Gaming grew out of that. Even Intel's first GPU (the i740) came from GE Aerospace.
whizzter 2 days ago [-]
Flight simulators just had more cash for more advanced chips, but arcade games like the Sega Model 1 (Virtua Racing) were, via Virtua Fighter, an inspiration for the PlayStation, and before that there were crude 3D games on both PC and Amiga.
Games were always going to go 3D sooner or later. The real pressure of the high-volume competitive market got us more and more capable chips, until they were capable enough for the kind of computation needed for neural networks, faster than a slow-moving specialty market could have.
timschmidt 2 days ago [-]
> Flight simulators just had more cash for more advanced chips
Yes. That is my point. The customers willing to pay the high initial R+D costs opened up the potential for wider adoption. This is always the case.
Even the gaming GPUs which have grown in popularity with consumers are derivatives of larger designs intended for research clusters, datacenters, aerospace, and military applications.
No question that chip companies are happy to take consumers money. But I struggle to think of an example of a new technology which was invented and marketed to consumers first.
whizzter 21 hours ago [-]
Computers themselves were non-consumer to begin with, but the Personal Computer broke the technology moat to consumers before anything else and once that had passed it was mostly a matter of time imho.
Many 3D games like Doom, Quake 1, Flight Unlimited, etc. ran purely on software rendering, since CPUs were already providing enough oomph to render fairly useful 3D graphics in the mid 90s. CPU power was enough, but consoles/arcades showed that there was more to be gotten (nothing hindered games at that point, though).
And already there, the capital investment for game consoles (Atari, NES, SNES, PS1, PS2, etc.) and arcade games (like the above-mentioned 3D games) was big enough to fund custom chipsets not used or purposed for anything else (I also think that in the 80s/90s the barrier to entry for making competitive custom chips was a tad lower; just consider the Cambrian explosion of firms during the 90s making x86 and later ARM chips).
Yes, there were vendors that focused on the high-end commercial customers, and yes, many alumni of those firms did contribute a ton of expertise towards what we have today.
But if you look at which companies survived and pushed the envelope in the longer run, it was almost always companies that competed in the consumer market, and it was only when those consumer chips needed even more advanced processing that we crossed the point where the chips became capable of NNs.
In fact I'd say that had the likes of SGI prevailed we would've had to wait longer for our GPU revolution. Flight simulators, etc. were often focused on "larger/detailed" worlds; PS2-era chips with higher polycounts and more memory would have been fine for simulator developers for a long time (since more detail in a military scenario would have been fine).
Leisure games have always craved fidelity on a more "human" level; to implement "hacks" for stuff like custom dynamic lighting models, then global illumination, subsurface scattering, etc., we've needed the arbitrary programmability since the raw power wasn't there (the most modern raytracing chips are _starting_ to approach that level without too-ugly hacks).
HPsquared 2 days ago [-]
It's symbiotic, I suppose.
somenameforme 1 days ago [-]
Wolfenstein 3D was released before 3dfx existed, was purely CPU rendered, and is generally considered the father of modern 3D shooters. Even without the scientific computing angle, GPUs would have been developed for gaming simply because it was a good idea that clearly had a big market.
rasz 1 days ago [-]
3dfx didn't. They had a subsidiary? spinoff? Quantum3D, that reused 3dfx commodity chips to build cards for simulators.
Spooky23 1 days ago [-]
100%. We’ve seen crazy swings in RAM prices before.
A colleague who worked with me about 10 years ago on a VDI project ran some numbers and showed that if a Time Machine were available, we could have brought like 4 loaded MacBook Pros back and replaced a $1M HP 3PAR SSD array :)
RachelF 1 days ago [-]
Well put. Since the 1980s the consumer has been driving the segment. Even supercomputers were built out of higher-end consumer hardware (or PlayStations, in one example).
The move to cloud computing and now AI mean that we're back in the mainframe days.
blablabla123 24 hours ago [-]
True, it is reminiscent of a time before me when people were lucky to have mainframe access through a university. To be fair, this was a long time in the making, with the also quite aggressive move to cloud computing. While I don't mind having access to free AI tools, they seem to be starting to take possession of the content as well.
Barrin92 2 days ago [-]
>We all had supercomputers in our pockets.
You still do. There is no "AI movement" you need to participate in. You can grab a copy of SICP and a banged up ten year old thinkpad and compute away, your brain will thank you. It's like when people complain that culture is unaffordable because the newest Marvel movie tickets cost 50 bucks: go to the library or standardebooks.org, the entire Western canon is free.
ManuelKiessling 1 days ago [-]
Living on the edge is expensive, and always has been.
Living on the edge from 4 years ago is basically free.
NedF 2 days ago [-]
[dead]
immibis 2 days ago [-]
It's not like you need 64GB to have "democratized computation". We used to have 64MB and that was plenty. Unfortunately, software got slower more quickly than hardware got quicker.
Aurornis 2 days ago [-]
> Unfortunately, software got slower more quickly than hardware got quicker.
Hard disagree. A $600 Mac Mini with 16GB of RAM runs everything insanely faster than even my $5000 company-purchased developer laptops from 10 years ago. And yes, even when I run Slack, Visual Studio Code, Spotify, and a gazillion Chrome tabs.
The HN rhetoric about modern computing being slow is getting strangely disconnected from the real world. Cheap computers are super fast like they've never been before, even with modern software.
Aerroon 1 days ago [-]
You brought up a light computing load that a laptop from like 2005 wouldn't struggle with?
People ran multiple browser windows, a 3D video game, irc (chat application), teamspeak/ventrilo (voice chat) and winamp (music) all at once back in the early 2000s. This is something an 8 year old phone can do these days.
Aurornis 21 hours ago [-]
I’m responding to the comment above claiming that modern software is slow on modern hardware. It’s an HN meme to claim that Electron apps are unusable.
AngryData 1 days ago [-]
It is pretty important if you are doing things like 3d animation, video editing, or advanced CAD software. Plus software in general has ballooned its memory requirements and expectations. Even my 11 year old PC had to have a RAM upgrade a few years ago just because software updates suck up so much extra memory, and there is almost nothing consumers can do about it.
ssl-3 1 days ago [-]
At any point in the 1990s, it was generally unfathomable to be using an 11-year-old PC for any modern purpose.
That an 11-year-old PC can keep up today (with or without an upgrade) is evidence that systems are keeping up with software bloat just fine. :)
vel0city 2 days ago [-]
> We used to have 64MB and that was plenty.
Bullshit. It was cramped and I wasn't able to do half of what I was wanting to actually do. Maybe it was plenty for your usecases, but such a small amount of memory was weak for my needs in the late 90s and 2000s. 64MB desktops struggled to handle the photo manipulations I wanted to do with scanned images. Trying to do something like edit video on a home PC was near impossible with that limited amount of memory. I was so happy when we managed to get a 512MB machine a few years later, it made a lot of my home multimedia work a lot better.
immibis 17 hours ago [-]
There are some use cases that simply require a lot of memory because they do, but I'm talking in general. Software that doesn't have a good excuse used to run in 64MB the way it now runs in 64GB.
Besides, you just said you only needed 512MB, which is still nothing these days.
vel0city 14 hours ago [-]
> Besides, you just said you only needed 512MB, which is still nothing these days.
I didn't say I "only needed 512MB", only that things were a lot better once we got a 512MB machine. Things continued to get massively better as I upgraded to a 1GB machine, an 8GB machine, etc.
> I'm talking in general
Isn't doing some light picture editing/organizing, playing back multimedia, etc. pretty dang general computer use these days? Or what, is "general" computer usage entirely limited to 80 column text manipulation? You'd have a hard time even just keeping my displays drawn with 64MB of memory at the resolutions and bit depths and multiple desktops that are common.
I play around with retro computers (especially the early/mid 90s for that nostalgia) and I'm constantly reminded of how little memory we really had to play with back then, and these are on pretty much fully loaded home desktop machines. Have a Word document open and you're trying to play back an MP3 and have a couple browser windows open? Oof, good luck! You want to stream a video? I hope it's about 10FPS at 320x240! Opening one photo from my camera today and you'll have used half the memory before it's even hit the framebuffer.
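Rough numbers behind that last point (the camera resolution and bit depth are assumptions for illustration, not a specific camera):

    # How far a 90s machine's memory goes against one decoded photo and one framebuffer
    megapixels = 24                       # assumed modern camera sensor
    bytes_per_pixel = 4                   # RGBA in memory after decoding
    photo_mib = megapixels * 1e6 * bytes_per_pixel / 2**20
    print(photo_mib)                      # ~91 MiB decoded -- already past a 64 MB machine

    framebuffer_mib = 2560 * 1440 * 4 / 2**20   # one 1440p 32-bit framebuffer
    print(framebuffer_mib)                # ~14 MiB just to hold a single screen's pixels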
SirFatty 2 days ago [-]
"memory manufacturers have been operating on the margins of profitability for quite a while now."
That the manufacturers are scumbags is the more likely answer.
I don't disagree per se, but this is the sort of thing that happens when only a few businesses exist in a commodity market with high entry costs. IOW, it's not great, but it is predictable. See: Oil.
HPsquared 2 days ago [-]
Looking forward to the "Organization of Processor-Etching Corporations".
marcosdumay 1 days ago [-]
It's usually not only illegal, but also a crime.
Anyway, that's the kind of market that governments always need to act upon and either supply directly or regulate intensively.
venturecruelty 2 days ago [-]
It's not just predictable, it's illegal. Of course, if you have an executive that actually cares about enforcing the law.
golem14 1 days ago [-]
I wonder how long it will take for China to flood the market with state-of-the-art modules. It's a pretty decent opportunity for them. They probably can hasten the build of new fabs more than many other nations.
But my guess is that this shortage is short-lived (mostly because of the threat above). There's no OPEC for tech.
There is a zero lower bound on the interest rate. Excess capital means negative returns on capital. The money system can't express the state of the real world, so either companies close down until the yield is positive, or the companies pass the artificial minimum price on to the consumer. In both cases, the real world is forced to match the state of the money system.
Being shocked that companies try their best to deal with the bad cards they have been dealt should be expected. The money system simply cannot express the concept of surplus capital or abundance. Positive interest means capital is scarce, so capital must be made scarce even if there is abundance.
Before you come up with the argument that the interest rate is supposed to reflect a market property and therefore does not force itself upon the market, remember that I said that there is an artificial restriction in the money system that prevents the state of the real market to be expressed. The non-profit economy has never had a chance to exist, because our tools are too crude.
The non-profit economy includes resilient production with slight/minor overproduction.
Think about how stupid the idea of a guaranteed 0% yield bond is (aka cash). The government obligates itself to accept an infinite amount of debt if the real return on capital would ever fall negative. No wonder it has an incentive to inflate the value of the bond away.
testing22321 2 days ago [-]
You mean “capitalists”.
Maximizing profit is the only sane way to play a rigged game
RobotToaster 24 hours ago [-]
For some reason it never crossed my mind that there would be futures for DRAM the same way there is for gold and silver.
Letting things go this unmanaged with a 3-year runway for AI demand seems a little hard to understand. In this case, not anticipating demand seems to create more profit.
AngryData 1 days ago [-]
I find it hard to believe the pandemic hurt the profits of computing hardware; demand went up from it, not down.
j45 1 days ago [-]
I'm not sure if profits were hurt, but the manufacturing did slow and stop and take some time to get going again.
somernd5678 2 days ago [-]
[dead]
darig 1 days ago [-]
[dead]
venturecruelty 2 days ago [-]
I guess we'll just have to stop making computer memory if it ceases to be profitable. The market is so efficient.
pmdr 2 days ago [-]
So, like, we were already pretty much priced out of higher-end graphics cards, and now it's happening to RAM. All this while jobs are disappearing, layoffs are ongoing and CEOs are touting AI's 'capabilities' left and right.
Next is probably CPUs. Even if AIs don't use them that much, manufacturers will shift production to something more profitable, then gouge prices so that only enterprises will pay for them.
What's next? Electricity?
Where the f*k is all the abundance that AI was supposed to bring into the world? /rant
georgefrowny 2 days ago [-]
Maybe that is the answer to how things are supposed to work if AI replaces everyone and no one can afford to buy their stuff.
Things being too cheap allows money to pool at the bottom in little people's hands in the forms of things like "their homes" and "their computers" and "their cars".
You don't really want billions in computing hardware (say) being stashed down there in inefficient, illiquid physical form, you want it in a datacentre where it can be leveraged, traded, used as security, etc. If it has to be physically held down there, ideally it should be expensive, leased and have a short lifespan. The higher echelons seem apparently to think they can drive economic activity by cycling money at a higher level amongst themselves rather than looping in actual people.
This exact price jump seems largely like a shock rather than a slow squeeze, but I think we're seeing some kind of reversal of the unique 20th-century "life gets better/cheaper/easier every generation" trend.
AngryData 1 days ago [-]
I very much disagree that consumers holding more hardware capabilities than they need is a bad thing. Replace computing hardware with mechanical tools, because they are basically tools, and consider whether consumers would be better off if wrenches and saw blades and machine tools were held more exclusively by businesses and large corporations. Would corporations use them more often? Probably. And yet it seems pretty clear that it would hurt the capabilities of regular people to not be able to fix things themselves or innovate outside of a corporate owned lab.
To me the #1 most important factor in maintaining a prosperous and modern society is common access to tools by the masses, and computing hardware is just the latest set of tools.
pjc50 1 days ago [-]
> And yet it seems pretty clear that it would hurt the capabilities of regular people to not be able to fix things themselves
Yes, that's the point. People fixing things themselves doesn't make the line go up, therefore it will be made harder.
showerst 1 days ago [-]
I think you missed some sarcasm there.
loudandskittish 1 days ago [-]
Until I got the last part, I actually thought you were being serious.
georgefrowny 21 hours ago [-]
The important thing is that the people who actually do matter here are being very serious.
And I assume some of them read these threads, so my advice to them would be to remember that the bunker air vents will probably be the main weak point.
btbuildem 1 days ago [-]
> Where the f*k is all the abundance that AI was supposed to bring into the world?
In the hands of the owners of the AI, as a direct consequence of the economic system. It was never going to play out any other way.
muldvarp 1 days ago [-]
Yeah, I'm always confused why programmers seem to like this technology given the vast negative consequences it will likely have for us. The upsides on the other hand seem to be the most insignificant things.
nickpp 1 days ago [-]
> upsides on the other hand seem to be the most insignificant things
An abundance of intelligence on Earth with all its spoils: new medicine, energy, materials, technologies and new understandings and breakthroughs - these seem quite significant to me.
muldvarp 1 days ago [-]
There is absolutely no guarantee that those things will happen just because Claude takes your job. Taking your job doesn't require super-intelligence, it doesn't even require human-level intelligence. It just requires enough intelligence to pump out mediocre code that sort of works, while being way cheaper to run than your pay.
Super-intelligence is a completely different can of worms. But I'm not optimistic about super-intelligence either. It seems super naive to me to assume that the spoils of super-intelligence will be shared with the people who no longer can bring anything to the table. You aren't worth anything to the super-rich unless you can do something for them which the super-intelligence can't do.
nickpp 24 hours ago [-]
There is absolutely no guarantee that Claude takes your job either. But if you believe so much in AI, investing in it is accessible to pretty much any pocket, you don't have to be rich to partake.
And when did "the rich" hoard anything for themselves only?! Usually I see them democratizing products and services so they are more accessible to everyone, not less.
Computers in my pocket and on my wrist, TVs as big as a wall and thin like a book, electric cars, flights to anywhere I dream of traveling, investing with a few clicks on my phone - all made possible to me by those evil and greedy rich in their race for riches. Thank you rich people!
muldvarp 24 hours ago [-]
> you don't have to be rich to partake.
You still need to be rich to partake. Most business ventures will still require capital even in the age of super-intelligence. Super-intelligence will make labor worthless (or very cheap); it won't make property worthless.
> And when did "the rich" hoard anything for themselves only?! Usually I see them democratizing products and services so they are more accessible to everyone, not less.
There are plenty of examples of rich people hoarding their wealth. Countries with natural resources often have poor citizens because those citizens are not needed to extract that wealth. There is little reason why super-intelligence will not lead to a resource curse where the resource is human intelligence or even human labor.
> Computers in my pocket and on my wrist, TVs as big as a wall and thin like a book, electric cars, flights to anywhere I dream of traveling, investing with a few clicks on a website - all made possible to me by those evil and greedy rich in their race for riches. Thank you rich people!
Those rich people didn't share with you out of the goodness of their heart but because it was their best strategy to become even richer. But that's no longer the case when you can be replaced by super-intelligence.
nickpp 16 hours ago [-]
> You still need to be rich to partake.
Again, you can invest, today, in AI stocks and ETFs, with just $100 and a Robinhood account. No need to be rich.
> Super-intelligence will make labor worthless (or very cheap); it won't make property worthless.
If the labor is worthless, the great majority of people will be poor. Due to the law of supply & demand, property will be worthless since there will be very little demand for it.
> Countries with natural resources often have poor citizens because those citizens are not needed to extract that wealth.
Countries with or without resources often have poor citizens simply because being poor is the natural state of mankind. The only system that, historically, allowed the greatest number of people to exit poverty is capitalism. Here in Eastern Europe we got to witness an astonishing change of fortunes when we switched from communism to capitalism. The country and its resources didn't change, just the system and, correspondingly, the wealth of the population.
> it was their best strategy to become even richer. But that's no longer the case when you can be replaced by super-intelligence.
How can they become richer when most people are dirt broke (because they were replaced by AIs) and thus can't buy their products and services? Look at how even Elon's fortunes shrink when his company misses a sales forecast. He is only as rich as the number of customers he can find for his cars.
muldvarp 14 hours ago [-]
> Again, you can invest, today, in AI stocks and ETFs, with just $100 and a Robinhood account. No need to be rich.
And then? I'll compensate the loss of thousands of dollars I don't earn anymore every month with the profits of a $100 investment in some ETF?
> If the labor is worthless, the great majority of people will be poor. Due to the law of supply & demand, property will be worthless since there will be very little demand for it.
Property has inherent value. A house I can live in. A farm can feed me. A golf course I can play golf on. These things have value even if nobody can buy them off me (because they don't have anything I want). Supply and demand determine only the _price_ not the _value_ of goods and services.
> Countries with or without resources often have poor citizens simply because being poor is the natural state of mankind. The only system that, historically, allowed the greatest number of people to exit poverty is capitalism. Here in Eastern Europe we got to witness an astonishing change of fortunes when we switched from communism to capitalism. The country and its resources didn't change, just the system and, correspondingly, the wealth of the population.
None of this has any connection to anything I've written. I'm talking about the concept of a resource curse. Countries rich in natural resources (oil, diamonds, ...) where the population is poor as dirt because the ruling class has no incentive to share any of the profits. The same can happen with AI if we don't do anything about it.
> How can they become richer when most people are dirt broke (because they were replaced by AIs) and thus can't buy their products and services?
Other rich people can buy their products and services. They don't need you to buy their products and services because you don't bring anything to the table because all you have is labor and labor isn't worth anything (or at least not enough to survive off it). Put differently: Why do you think rich people would like to buy your labor if using AI/robots is cheaper? What reason would they have to do that?
> Look at how even Elon's fortunes shrink when his company misses a sales forecast. He is only as rich as the number of customers he can find for his cars.
You're proving my point: Elon still lives in a world where labor is worth something. Because Elon lives in a world where labor is worth something it is in his interest that there are many people capable of providing that labor to him. This means it is in his interest that the general population has access to food and water, is well educated, ...
If Elon were to live in a world where labor is done by AI/robots there would be little reason for him to care. Yes, he couldn't sell his cars to the average person anymore, but he wouldn't want to anyway. He could still sell his cars to Altman in exchange for an LLM that strokes his ego or whatever rich people want.
The point is: Because rich and powerful people still have to pay for labor, their incentives are at least somewhat aligned with the incentives of the average person.
ikety 18 hours ago [-]
I feel like you're fighting the fallacy of "the rich" being collectively blamed for every problem, by giving them credit for everything instead.
We know that none of the goods you listed would be available to the masses unless there was profit to be gained from them. That's the point.
I have a hard time believing a large group being motivated and mutually benefiting towards progression of x thing would result in worse outcomes than a few doing so. We just have never had an economic system that could offer that, so you assume the greedy motivations of a few is the only path towards progress.
nickpp 16 hours ago [-]
> We just have never had an economic system that could offer that
Please propose it yourself.
> you assume the greedy motivations of a few is the only path towards progress
No. I assume the greedy motivations of the many are the best path towards progress. Any other attempts to replace this failed miserably. Ignoring human nature in ideologies never works.
FuckButtons 18 hours ago [-]
Literally none of what you just said is true. All of those things happened because there was a market opportunity, there was a market opportunity because wealth was not just in the hands of the rich.
If you want to look at what historically has happened when the rich have had a sudden rapid increase in intelligence and labor, we have examples.
After the end of the Punic wars, the influx of slave labor and diminution of the economic power of normal Roman citizens led to: accelerating concentration of wealth, civil war and an empire where the value of human life was so low that people were murdered in public for entertainment.
nickpp 16 hours ago [-]
> All of those things happened because there was a market opportunity, there was a market opportunity because wealth was not just in the hands of the rich.
Yet those things did not happen in communist countries (or happened way less in socialist ones), during the same time period, even though the market was there too. That is why EU's socialist countries consume high tech products and services from the USA and not the other way around.
Refreeze5224 1 days ago [-]
Ding ding ding. What a surprise that a system designed not for human flourishing but pure profit would actually deliver massive profit with no regard for human flourishing.
Humanity will have to adopt new human-focused modes of living and organizing society, or else. And climate change is coming along to make sure the owning class can't ignore this fact any longer.
nickpp 1 days ago [-]
> a system designed not for human flourishing but pure profit
But please, don't be coy: tell us about that other system that is designed for "human flourishing" - we're dying to learn about it.
Because I grew up under communism and I lived its miserable failures: the non-profit system didn't even manage to feed, clothe, or warm/cool us.
> new human-focused modes of living and organizing society
Oh, these sound sooo promising. Please do tell us: would you by any chance be willing to use force to "convince" the laggards of the benefits of switching? What if some refuse to believe your gospel? Will you turn to draconian laws and regulations?
brooke2k 21 hours ago [-]
It's depressing how in the modern day you can't criticize capitalism without immediately being told that you must be a supporter of soviet-style authoritarian socialism
There are shades of grey here. Capitalism is a system with many inherent problems. Exploring alternatives is not the same thing as being a Stalinist
Refreeze5224 19 hours ago [-]
This exactly. Capitalist propaganda likes to paint anything other than capitalism as Stalinist authoritarian communism, which should be abhorred as well as capitalism, and just for the same reason: both are coercive, hierarchical, and unfree.
nickpp 1 hours ago [-]
Because every time I encounter such capitalism haters they turn out to be marxists in disguise. Usually pushed and promoted by people that never lived outside of their comfortable capitalist wealth bubble, written on capitalist devices, here, on this very venture capitalist's forum! The hypocrisy boggles the mind, really.
It's like they lack the most basic understanding of economics and have never read any history. I mean, communism has failed everywhere it was tried, and there were so many A/B tests that plainly show each system's results: North vs South Korea, Eastern Europe before vs after 1990, USA vs USSR, Argentina during the last hundred years, Venezuela before and after Chavez, etc.
Or they push socialism under new names ("democratic") as if it's a new thing, not just a watered down form of communism, with authoritarian communism being the logical end game of socialism - because "at some point you run out of other people's money" and you need force to keep fleecing them. Just like it happened in Venezuela...
Anamon 22 hours ago [-]
They didn't say communism was the only other option; this seems like a bad faith reply.
Capitalism increasingly fails to provide well-being to the majority of the global population. It's obvious we need to come up with something else, even if it's not clear yet what shape that will take.
If we can't find an alternative that works, we can also just wind down humanity, and not much of value to the universe will be lost :)
Cthulhu_ 21 hours ago [-]
But there's solutions proposed all the time; tax the rich, tax the corporations, and use that money for socialist policies, like the ones being dismantled by the US right now. Regulate the biggest expenses like health care, housing and energy, restrict how much datacenters can use as their consumption drives up the prices, etc.
You don't need to go full communist to make things better.
nickpp 15 hours ago [-]
And we've seen where those "solutions" lead. The EU chose to embrace this watered-down communism experiment and we can plainly see the results today: the euro-poor are badly lagging the USA. Precious few major businesses created in the last 20 years, almost zero new products and services being introduced here. China has taken the manufacturing lead, the US the innovation one.
We have to go cry to "daddy Trump" for protection, unable to even defend ourselves when a greedy, blood-thirsty country with an economy smaller than Italy's decided to attack.
Regulating health care made medical research escape to the USA. Regulating building created the biggest housing crisis affecting young couples most, in turn reducing natality even further. Regulating energy pushed us into the warm embrace of the Russian bear, who exploited our dependency to the max. GDPR ensured no web-based EU startup can be competitive with the US ones. Regulating AI was just funny at that point since it is all made in China and the USA anyway...
Yeah, you don't need to go full communist to f things up but the closer you go, the worse things get.
Refreeze5224 20 hours ago [-]
I appreciate the lack of sarcasm in your reply. And echo the other reply's point about not being able to criticize the system that is literally destroying the planet and is by design totally unsustainable...
> But please, don't be coy: tell us about that other system that is designed for "human flourishing" - we're dying to learn about it.
Libertarian socialism, anarchocommunism, any system where human freedom is the basis, and not coercion or hierarchy. This stuff is not new or radical, it's just not favored by people with lots of money to lose.
> Oh, these sound sooo promising. Please do tell us: would you by any chance be willing to use force to "convince" the laggards of the benefits of switching? What if some refuse to believe your gospel? Will you turn to draconian laws and regulations?
Lucky for you, no. The complete opposite. Freedom of association is the entire foundation of it. We all get to associate with whomever we want, when and for as long as we want. Someone being a condescending prick in your local comment section? You get to ignore them! No commissars or cops or Party. Someone wants to go play hierarchical capitalism with his friends? As long as he's not messing with other people or contravening their rights, they get to do whatever they want.
Will any of these systems result in 99 cent stores, fast food restaurants, or people on the moon? Almost definitely not. But those are all irrelevant to creating a sustainable environment designed for human beings, and not profit.
The lack of innovation (or even reading of basic history...) in what is possible in terms of organizing human societies is frankly sad, especially among tech workers. Most people are too influenced by capitalism (intentionally so) to believe that how things are now is the only way they can be. There is so little scope for innovation and change, and that starts with the owning class who have no interest in it changing.
nickpp 1 days ago [-]
> In the hands of the owners of the AI
With a hundred bucks and a Robinhood account, you too can be part of this greedy, evil and mysterious "owners of AI" class and (maybe) some day enjoy the promised spoils.
Oh the wonders of Capitalism, the economic system offering unequal abundance to everyone caring to take part... Where are the other, much touted systems, masters at spreading misery equally?
How much is due to long overdue infrastructure upgrades and greed by providers, vs the cost of energy?
Also, consumer prices _have_ risen (mine included), but it's not clear that this is only because of AI. While EV charging is not at the scale of all data centers combined, it seems to be growing even faster than datacenter consumption, and is expected to eclipse the latter around 2030. Maybe sooner due to missing solar incentives.
Also, to rant on:
According to [1], an average Gemini query costs about 0.01 cents (Figure 2 - say 6000 queries per kWh at 60 cents/kWh, which is probably more than industrial consumers pay). The same paper says other providers are not off by that much. I dare say that, at least for me, these queries save a lot of time and effort compared to what I'd traditionally have to do (go to the library, manually find sources on the web, etc.), so arguably, responsibly used, AI is really quite environmentally friendly.
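For anyone who wants to re-derive that figure, here is a minimal sketch of the arithmetic; the 6000 queries/kWh and 60 cents/kWh numbers are the assumptions quoted above, not values asserted independently:

```python
# Back-of-envelope check of the per-query energy cost quoted above.
# Assumed inputs (from the comment, not authoritative): ~6000 queries per kWh,
# electricity at $0.60/kWh (likely higher than what industrial consumers pay).
queries_per_kwh = 6000
price_per_kwh_usd = 0.60

cost_per_query_usd = price_per_kwh_usd / queries_per_kwh
print(f"~${cost_per_query_usd:.4f} per query ({cost_per_query_usd * 100:.2f} cents)")
# -> ~$0.0001 per query (0.01 cents)
```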
Finally: Large data centers and their load is actually a bit fungible, so they can be used to stabilize the grid, as described in [2].
I would think it would be best if there were more transparency on where the costs come from and how they can be externalized fairly. To give one instance, Tesla could easily [3] change their software to monitor global grid status and adjust charging rates. Did it happen ? Not that I know. That could have a huge effect on grid stability. With PowerShare, I understand that vehicles can also send energy back to power the house - hence, also offload the grid.
> How much is due to long overdue infrastructure upgrades and greed by providers, vs the cost of energy?
This only makes sense if you ignore profits. We've been paying the bills since before this was "overdue"; for instance, I am still paying a storm recovery surcharge on my electric bill from before I ever moved to this state. At the point where a "temporary infrastructure surcharge for repairs" becomes a line item on their profit statement, that's where I start to get real annoyed.
Our electric company has 287,000 customers and has a market cap of >$800,000,000
What percentage of that eight tenths of a billion in market cap came from nickel and diming me?
* note: nickel and dime was established as "an insignificant amount of money" in the 1890s, when sirloin, 20% fat, was $0.20 a pound. That's $13.50 now (local); chuck was $0.10 and is $19 now. So a nickel and dime have somewhere between 67 and 190 times less buying power now. Also that means that, y'know, my surcharges being $15-$30 a month are historically "nickels and dimes"
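For the curious, the same arithmetic as a quick sketch; the prices are the ones quoted in the note above (the commenter's local figures, not official inflation data):

```python
# Buying-power multipliers implied by the beef prices in the note above.
sirloin_then, sirloin_now = 0.20, 13.50   # $/lb, 1890s vs. today (local)
chuck_then, chuck_now = 0.10, 19.00       # $/lb

low_multiple = sirloin_now / sirloin_then   # ~67x
high_multiple = chuck_now / chuck_then      # ~190x

for coin, value in (("nickel", 0.05), ("dime", 0.10)):
    print(f"A {coin} then buys what ${value * low_multiple:.2f}-${value * high_multiple:.2f} buys now")
# A nickel lands around $3.40-$9.50 and a dime around $6.75-$19.00, so $15-$30/month
# of surcharges really is "nickels and dimes" in historical terms.
```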
In my personal case, it was cost of energy due to the Russian invasion of Ukraine triggering a stop on buying cheap gas from them. Infrastructure upgrades are also being done, costing tens of millions per year because the grid can't handle the sudden increase in renewable energy generation and electrification (to ironically move away from dependency on gas).
I mean part of me thinks it's a necessary evil because we relied too much on Russian gas in the first place. But that's because we extracted most of our own gas already (https://en.wikipedia.org/wiki/Groningen_gas_field), which that article lists as one of the factors in the Dutch welfare state being a thing - it and smaller fields out at sea contributed over 400 billion to the Dutch economy since the 1950's.
more_corn 1 days ago [-]
Rooftop solar doesn’t get more expensive over time.
entropi 23 hours ago [-]
Roofs do, though.
Cthulhu_ 21 hours ago [-]
> What's next? Electricity?
That and water. Electricity: Google made a post about scaling k8s to 135.000 nodes yesterday, mentioning how each node has multiple GPUs taking up 2700 watts max.
Water, well this is a personal beef, but Microsoft built a datacenter which used potable / drinking water for backup cooling, using up millions of liters during a warm summer. They treat the water and dump it in the river again. This was in 2021, I can imagine it's only gotten worse again: https://www.aquatechtrade.com/news/industrial-water/microsof...
mleonhard 7 hours ago [-]
Is any datacenter's water use significant compared to other industrial installations? According to that article, all datacenters in North Holland use 550 Ml/yr. North Holland has 2.95M residents [0], who use 129 l/person-day [1], 47 Kl/person-year, 139,000 Ml/year for the whole region. So the data centers use an estimated 0.4% of the region's water. Data centers use about 3% of the Netherlands' electricity.
Why do you think this is a lot of water? What are the alternatives to pulling from the local water utility and are those alternatives preferable?
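Reproducing that estimate as a quick sketch, using the population and per-capita figures cited above:

```python
# Data-center water use in North Holland as a share of residential use,
# using the figures cited above (550 Ml/yr for data centers, 2.95M residents,
# 129 litres per person per day).
datacenter_ml_per_year = 550
residents = 2_950_000
litres_per_person_per_day = 129

residential_ml_per_year = residents * litres_per_person_per_day * 365 / 1_000_000
share = datacenter_ml_per_year / residential_ml_per_year
print(f"Residential: ~{residential_ml_per_year:,.0f} Ml/yr; data centers: {share:.1%}")
# Comes out to roughly 139,000 Ml/yr residential, with data centers at ~0.4%.
```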
In the share prices. Hope you're rich, because that's the only thing the economy cares about.
charcircuit 1 days ago [-]
Many apps support buying fractional shares. You don't have to be rich to buy shares in public companies.
phil21 1 days ago [-]
It's only meaningful if you have enough disposable income to invest that it eventually (and I don't mean in 50 years) makes a dent against your living expenses.
If you make $4k/mo and rent is $3k, it's pretty silly to state that it's a meaningful thing for someone to scrimp and invest $100/mo into a brokerage account.
They definitely should do this, but it's not going to have any meaningful impact on their life for decades at best. Save for a decade to get $12k in your brokerage account, say it doubles to $24k. If you then decide you can get a generous 5% withdrawal rate, you are talking $1,200/yr against rent that is now probably $3,500/mo or more. Plus you're killing your compounding.
It's good to have so emergencies don't sink you - but it's really an annoying talking point I hear a lot lately. Eye rolling when you are telling someone struggling this sort of thing.
It really only makes a major impact if you can dump large amounts of cash into an account early in life - or if you run into a windfall.
neogodless 23 hours ago [-]
If you contribute $1k per month for 10 years earning 6% you would have $162k.
5% of that would be $8100.
Is $48k / year a typical income?
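(For reference, a minimal sketch of the compounding math behind those numbers, assuming a steady 6% annual return compounded monthly; real returns are of course nowhere near this smooth, and slightly different compounding assumptions give the ~$162k figure above:)

```python
# Future value of fixed monthly contributions at an assumed constant return.
# A steady 6%/yr compounded monthly is an assumption for illustration only.
def future_value(monthly: float, annual_rate: float, years: int) -> float:
    r = annual_rate / 12
    n = years * 12
    return monthly * ((1 + r) ** n - 1) / r

balance = future_value(1000, 0.06, 10)
print(f"$1k/month for 10 years: ${balance:,.0f}")         # ~$164k
print(f"5% withdrawal per year: ${balance * 0.05:,.0f}")  # ~$8,200
print(f"$100/month for 10 years: ${future_value(100, 0.06, 10):,.0f}")  # ~$16,400
```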
genewitch 22 hours ago [-]
where are you getting $1k/month? GP said "invest" $100 per month given a $48,000 income and rent of ~$36k/yr.
Yeah, obviously if I can sock away $12,000 a year for 10 years I'll have money. Just be aware that I was in a bunch of funds from 2010-2020, and were it not for what happened at the end of that decade I wouldn't have made any additional money at all. In fact, I would have lost a decent chunk of money - not just in fees, but inflation.
Also, where are you guaranteed 6% for a decade? t-bills or something?
Cthulhu_ 21 hours ago [-]
Who in this economy has $1K left over at the end of the month to spend / risk freely?
iszomer 12 hours ago [-]
You'd be surprised.
smallmancontrov 22 hours ago [-]
The pauper who invests peanuts and receives peanuts isn't the problem. The professional who invests modest sums and boosts their retirement isn't the problem.
The centibillionaire who invests their fortune and receives $50M a day from the market and cares about nothing more than keeping the gravy train going is the problem. They head to Washington and splash their free money hose around in exchange for political support to keep that hose pumping at all costs. That's why politicians were so lock-step about how it was super important for the US to sell our industry to China and drop capital gains tax below income tax and open the Double Irish Sandwich and now they're getting rid of social programs and instituting 50 year mortgages and so on.
The fact that the guy with the firehose can pump the firehose by stepping on the guy investing $1k month is the core of the problem. Until you are at escape velocity -- your net worth is enough to cover your lifestyle from investment returns alone -- you lose by default.
charcircuit 1 days ago [-]
Rent should not be more than 1/3 of your income so moving to an appropriate place will let that person save an extra 2k a month. It would take a year instead of a decade to have invested 24k.
igleria 1 days ago [-]
> moving to an appropriate place
like under a bridge or something? Pardon the hyperbole, but you would have to assume people with no disposable income are idiots in order to suggest that solution.
charcircuit 1 days ago [-]
Or at their parents' house, or going to find roommates. If you can't afford to move out by yourself then you shouldn't do it if you want to be financially responsible.
lp4v4n 23 hours ago [-]
You look like the same kind of person who will complain that people won't have children nowadays.
loeg 18 hours ago [-]
Maybe not idiots, but many humans are definitely not homo economicus rational actors.
m_rpn 1 days ago [-]
Rent should not be more than 1/3 of your income but to get an income you usually need to be in a place where rents are more than 1/3 of such income.
array_key_first 22 hours ago [-]
The lower COL areas are lower because they're less economically viable, and therefore less desirable. The job opportunities and income will, more or less, match the lower COL.
There are some exceptions, like rural doctors who can make more than city doctors due to high demand. But the less "physical" your job is, the rarer these exceptions become.
For software devs, you can move out of silicon valley. Maybe to Texas. And now those 1.5 million dollar homes are only 700,000 dollars. But your salary will reflect that.
loeg 1 days ago [-]
It's not like the individual share price of e.g. VTI or FZROX is all that high to begin with.
muldvarp 1 days ago [-]
You thought that abundance would trickle down into the pockets of people that have to work for a living?
AI will lead to abundance. For those that own stuff and no longer have to pay for other people to work for them.
nickpp 1 days ago [-]
> abundance. For those that own stuff and no longer have to pay for other people to work for them.
Why are you saying that? Anybody working for a living (but saving money) can invest in AI stocks or ETFs and partake in that potential abundance. Robinhood accounts are free for all.
Cthulhu_ 21 hours ago [-]
> Anybody working for a living
> (but saving money)
These two are already difficult or impossible for many people. Especially a big chunk of USAmericans have been living paycheck to paycheck for a long time now, often taking multiple jobs just to make ends meet.
And then to gamble it on a speculative market whose value does not correlate with its performance (see e.g. Tesla; compare its sales and market share against its market value relative to other car manufacturers). That's bad advice. And as an individual, you'd only benefit a fraction of what the big shareholders earn. Investing is a secondary market.
nickpp 15 hours ago [-]
> a big chunk of USAmericans have been living paycheck to paycheck
That doesn't mean they are poor, just poor with their money. Saving is a skill that needs to be learned.
> That's bad advice.
No. Not investing, when the S&P500 index had a 6% inflation-adjusted annual historical return over the last 100 years - is bad advice. Not hedging the arrival of an AGI that you think can replace you - is bad advice.
47282847 1 hours ago [-]
It’s not a utopian idea to try as a society to hedge and distribute returns to all members. From a human resource perspective, it is insane (i.e. costly and inefficient) to put it on the shoulders of each individual. But that’s where we are (almost; happy to live in a country with still some public services like unified healthcare and retirement funds).
Yes, you can drill your own well to have water in a society, for some. Or, you come up with the unheard-of idea of public utilities, so people can simply open a tap and enjoy. In some regions, even drink it. Personally, growing up in a lucky place like that, I have a hard time imagining to ever live in a place that required me to buy bottled water.
Yes, you can demand each member of society to learn about ETFs. Personally, I enjoy every part of life where complexity is being dealt with for me; I wouldn’t survive a day without that collaboration.
We have a choice to design society, like we have a choice to design computer systems.
Anamon 22 hours ago [-]
...which almost everybody agrees is a bubble, wondering not if, but when it will burst.
Investing in AI companies is just about the last piece of advice I'd give someone who's struggling financially.
The billionaires will largely be fine. They hedge their bets and have plenty of spare assets on the side. Little guy investors? Not so much. They'll go from worrying about their retirement plan to worrying about becoming homeless.
EliteGadget 2 days ago [-]
RAM is a commodity; spikes in demand and shortages of supply drive the price up like this.
I remember when there was a flood in Thailand in 2011 and the prices of hard disks went through the roof.
The abundance is there, it just isn't for you or other working class people
tapoxi 2 days ago [-]
Electricity prices are already skyrocketing, it's present - not next.
imglorp 2 days ago [-]
Yes electricity by more short-sighted, dirty methods (remember when crypto clowns bought an old fossil plant just for coin mining?) but more alarming, fresh water.
That can be a bigger problem for civilization.
theandrewbailey 1 days ago [-]
Drive prices have already exploded. New hard drives have doubled in price since the beginning of the year. I haven't checked SSD prices, but why would they not be crazy, too?
The AI bubble has also pushed up secondhand prices. I work in ewaste recycling. Two weeks ago, a petabyte's worth of hard drives showed up. After erasing, testing, listing, and selling them, I'll have a very merry Christmas.
jayd16 2 days ago [-]
>Where the f*k is all the abundance that AI was supposed to bring into the world?
That'll come with the bubble bursting and the mass sell off.
igleria 1 days ago [-]
Given that everyone already answered what I had in mind, what can we do about it? What happened in the past where we might get answers from?
balls187 1 days ago [-]
I picked a really great time to give up PC Gaming for writing.
dannyobrien 1 days ago [-]
After the supply constraints of the post-covid period, are graphics cards really that more expensive?
How long do you estimate this period of supply constraint will be? Will manufacturers continue to be greedy, or will they grow less greedy as the supply improves based on the price signals the high price indicates?
pfannkuchen 1 days ago [-]
Sounds like a business opportunity for someone to come in and scale better.
thegrey_one 1 days ago [-]
That's what happens when demand is higher than supply.
"Where the f*k is all the abundance that AI was supposed to bring into the world?"
More money for shareholders. $5 trillion for Nvidia???? More like a quadrillion for Nvidia's market cap.
kmeisthax 2 days ago [-]
In world history, the vast majority of abundance is downstream of conquest, not innovation. Plunder makes profit. Even in weird moments like today, where innovation is (or, at least, was) genuinely the driving force of abundance, that innovation would not have come about without the seed capital of Europe plundering Africa and the Americas.
Abundance isn't even the right framing. What most people actually want and need is a certain amount of resources - after which their needs are satiated and they move onto other endeavors. It's the elites that want abundance - i.e. infinite growth forever. The history of early agriculture is marked by hunter-gatherers outgrowing their natural limits, transitioning to farming, and then people figuring out that it's really fucking easy to just steal what others grow. Abundance came from making farmers overproduce to feed an unproductive elite. Subsistence farming gave way to farming practices that overtaxed the soil or risked crop failure.
The history of technology had, up until recently, bucked this trend. Computers got better and cheaper every 18 months because we had the time and money to exploit electricity and lithography to produce smaller computers that used less energy. This is abundance from innovation. The problem is, most people don't want abundance; the most gluttonous need for computational power can be satisfied with a $5000 gaming rig. So the tech industry has been dealing with declining demand, first with personal computers and then with smartphones.
AI fixes this problem, by being an endless demand for more and more compute with the economic returns to show for it. When AI people were talking about abundance, they were primarily telling their shareholders: We will build a machine that will make us kings of the new economy, and your equity shares will grant you seats in the new nobility. In this new economy, labor doesn't matter. We can automate away the entire working and middle classes, up to and including letting the new nobles hunt them down from helicopters for sport.
Ok, that's hyperbole. But assuming the AI bubble doesn't pop, I will agree that affordable CPUs are next on the chopping block. If that happens, modular / open computing is dead. The least restrictive computing environment normal people can afford will be a Macbook, solely because Apple has so much market power from iPhones that they can afford to keep the Mac around for vanity. We will get the dystopia RMS warned about, not from despotic control over computing, but from the fact that nobody will be able to afford to own their own computer anymore. Because abundance is very, very expensive.
> Where the f*k is all the abundance that AI was supposed to bring into the world? /rant
It may have been a bit self-deprecating, but I think your “rant” is a more than justified question that really should be expanded well beyond just this matter. It’s related to a clear fraud that has been perpetrated upon the people of the western world in particular for many decades and generations now, in many different ways. We have been told for decades and generations that “we have to plunder and debase your money and give it to the rich that caused the {insert disaster caused by the ruling class} misery, and we have to do it without any kind of consequences for the perpetrators, and no, you don’t get any kind of ownership or investment, and we have to do it now or the world will end”
zozbot234 2 days ago [-]
Actually, this seems to be mostly a spike in retail prices, not wholesale DRAM contracts that are only up 60% or so in the past few months according to Samsung. So we should most likely place at least some fraction of the blame on our fellow consumers for overreacting to the news and hoarding RAM at overinflated prices. DRAM sticks are the new toilet paper.
csdreamer7 1 days ago [-]
> Actually, this seems to be mostly a spike in retail prices, not wholesale DRAM contracts that are only up 60% or so in the past few months according to Samsung. So we should most likely place at least some fraction of the blame on our fellow consumers for overreacting to the news and hoarding RAM at overinflated prices. DRAM sticks are the new toilet paper.
What is your source on that? Moore's Law is Dead directly contradicts your claims by saying that OpenAI has purchased unfinished wafers to squeeze the market.
Note the consistent "up to 60% since September" figure in the above recent reports. That's for one module capacity, with others being up 30% to 50% - and it certainly isn't the 200% or more we're apparently seeing now in the retail market. That's pure panic hoarding, which is actually a very common overreaction to a sudden price spike.
PunchyHamster 2 days ago [-]
"only" ? Nobody is hoarding RAM (at least yet, consumers seem mostly blindsided by it), this is directly caused by industry thirst for AI
Hardware is faster, but the "abstraction tax" is higher than ever.
As someone currently fighting to shave megabytes off a C++ engine, it hurts my soul to see a simple chat app (Electron) consume 800MB just to idle. We spent the last decade using Moore's Law to subsidize lazy garbage collection, five layers of virtualization, and shipping entire web browsers as application runtimes. The computer is fast, but the software is drowning it.
librasteve 2 days ago [-]
I used to sell 64kbit (yes, bit) DRAM at $7 in 1982. 1 year later was <$0.50.
The memory business is a pure commodity and brutally cyclic. Big profit => build a fab => wait 2 years => oh shit, everyone else did it => dump units at below cost. Repeat.
loloquwowndueo 2 days ago [-]
Then you have “acts of God” like that time when a flood or fire or something caused the only factory in the world that produced some obscure part of memory chips to stop production, and memory costs doubled almost overnight. I remember my 4 -> 32 MB upgrade back in the 90s cost a fortune because of this.
Old enough here to remember Intel exiting the DRAM business.
timschmidt 2 days ago [-]
Same! And then they made new eDRAM for a hot minute as part of Crystal Well. It'd be fun to see them get back into the game in a serious way, but their on-again-off-again attitude toward dGPUs does not give me confidence in their ability to execute on such long-term plans.
kabdib 2 days ago [-]
Old enough here to remember Intel entering the DRAM business :-)
merelysounds 2 days ago [-]
In case anyone else wanted to check, PS5 has[1]:
> Memory: 16 GB GDDR6 SDRAM
So unless the RAM price jumps to 4x the price of a PS5, getting a PS5 is not the most cost efficient way to get to 64 GB of RAM.
In comparison, PS3 has been used to build cheap clusters[2].
Yes, stupid comparison really. Also 64GB is pretty high-end from a consumer perspective. Most would do just fine with 32 as 2x16GB.
AngryData 1 days ago [-]
Maybe if they expect to upgrade within a few years it would be fine. But when I built my current computer 11 years ago I also didn't expect to need 16 gb of ram and only bought 8. 5 years later 16 gb of memory was a requirement for both software and games I was playing. And now 11 years later 16 gigs is not enough for fairly "simple" 3d modelling and 32 gigs is pretty close to the minimum requirement to fully utilize other modern hardware.
kristianp 1 days ago [-]
I bought a 16gb desktop for work in 2011, plenty for Visual Studio at the time. 8 is a bit skimpy for a desktop build even in 2014.
crims0n 1 days ago [-]
Speaking of 11 years old, I just put my 4790k out to pasture. It was a good CPU for a long time, but it got a little long in the tooth for modern workloads.
fzeroracer 1 days ago [-]
I bought 2x16gb for my home computer at $90 about three months ago. When I checked the price of the exact thing I bought just in the past day, it's now $270. The price increase is across the board whether it's a low end or high end build.
pseudosavant 2 days ago [-]
Built my son's first gaming PC 2 months ago. Figured it would be cheaper around Black Friday, but the prices were reasonable enough that we didn't wait. Turned out to be a huge savings to buy that fast DDR5 in September.
EddieB 2 days ago [-]
Just went through this today for my daughter- struggled to find an i5 not on pre-order, and the RAM was crippling- ended up going Ryzen 7 for a first time and 2x8Gb DDR5 6000 @ £119 - looking forward to building it with her!
Anamon 22 hours ago [-]
Ryzen is the better choice anyway :)
Props on building a PC with your kid, I have very fond memories of doing that with my dad. Have fun!
f0rgot 1 days ago [-]
This is really crazy. I built my first computer not too long ago; like I'm talking less than a month maybe, definitely less than 2. I paid $320 for 64GB Kit (CMK64GX5M2B5600C40) at Microcenter. It is now sold out in Chicago store and listed at $530.
f0rgot 1 days ago [-]
$760 on NewEgg! Yeeezus Christ. Literally 50% of what I spent on the entire build just on RAM.
alias301 1 days ago [-]
I bought that exact model on 11/1/2024 @ $164.99
NekkoDroid 1 days ago [-]
Luckily I bought my extra 32G of DDR4 (now have 64G) used a while ago, only paid like 80€ for it. I remember back in like 2018 or so when I originally build this PC I got 4x4G DDR4 for like 160€ when prices were also crazy.
cwbriscoe 2 days ago [-]
I wish I would have done what you did. Especially since I wanted 128GB. Now I am probably going to settle for 64GB or maybe 96GB.
venturecruelty 2 days ago [-]
The better play would've been to buy Bay Area real estate in the 1970s, but what're you gonna do? lights cigarette
djvdq 23 hours ago [-]
I wanted to build a gaming PC around summer, to be able play with my son, but I postponed it for no real reason. I built this PC a 2 weeks ago, so instead of paying 250 pln (~$70 usd) for 32 GB RAM, I paid 899 pln (~$250). Now, exactly the same RAM costs 1099 pln (~$300).
mikesickler 2 days ago [-]
I bought 32GB x 2 of G.SKILL a year ago and paid $204. Now it's $600. Insane
bpye 2 days ago [-]
Just checked my invoice from last year, $316 CAD for 2x32GB 5600MHz DDR5 ECC UDIMMs.
I'm now seeing $480 CAD for a single stick.
wiredfool 2 days ago [-]
I just got one of the last beelink ser-8s with 64gb for $750. They sold out by the time my order arrived. The newer ones are starting around 830 for a 32gb machine (admittedly with newer everything)
vkou 2 days ago [-]
I bought 64GB of DDR5 two weeks ago. That same RAM is now twice the price.
theginger 2 days ago [-]
When did a PS5 become a unit of cost?
For reference seems to be about 0.002 London buses
jkingsman 2 days ago [-]
I think it's intended as a comparison of cost when building a gaming-capable computer vs. a console of somewhat equivalent power.
It used to be a general rule of thumb that you could build a computer of roughly equivalent power for the cost of a game console, or a little more — now the memory costs more than the whole console.
bhelkey 2 days ago [-]
> I think it's intended as a comparison of cost when building a gaming-capable computer vs. a console of somewhat equivalent power.
The PS5 has 16GB of RAM. One can buy 16GB of RAM for ~$100 [1].
Thank you for mentioning this. Not knowing the specs of a PS5, I'd assumed that the comparison was made because the PS5 now sold for less than the RAM it contains, and scalpers were now hungrily buying up all available PlayStation inventory just to tear them open and feast on the RAM inside.
But since it's 16 GB, the comparison doesn't really make sense.
hsbauauvhabzb 1 days ago [-]
But that’s not apples to apples, a computer will generally need more ram to compete with a console.
addandsubtract 1 days ago [-]
The PS5 also has GDDR6 RAM, compared to the DDR5 in the link.
AuthAuth 2 days ago [-]
It still is a rule of thumb; you don't need DDR5 for a gaming computer, let alone 64GB. A low end AM4 CPU + 16GB of DDR4 3600 and a decent GPU will beat a PS5 in performance and cost. I don't understand why the headline made this strange comparison.
nottorp 1 days ago [-]
The thing is, on a ps5 you just open the game and it runs fine.
On a PC you may have the bright idea to open a browser along with the game for walkthroughs/hints. Or Discord to chat with your friends while gaming.
Due to javascript bloat, your working set size goes from 16 to 48-64 GB in a jiffy.
kaszanka 1 days ago [-]
That's still not a fair comparison, because on a console you don't have the option to do any of that.
dlock17 19 hours ago [-]
It is a pretty fair comparison.
You do have the option to open up Discord voice chats on PS5. Amazing what Discord could do when forced to actually write something efficient.
Youtube also exists as an app, and maybe you can trick the heavily gimped built in browser to go there as well, although last I checked it wasn't trivial.
kaszanka 10 hours ago [-]
TIL! That's neat, I wonder how much RAM that client uses compared to the desktop one.
nottorp 23 hours ago [-]
It kind of is, because if you use a PC like a console 16 GB is enough. If you use a PC like a PC it's not.
keyringlight 2 days ago [-]
It doesn't help that GPUs have also generally gone up over the past decade: there's more of a market for them besides gaming, they benefit from being hugely parallel (the larger you can make them, the better), and fabrication costs are shooting up. I think there was a GamersNexus video at the launch of one of the previous GPU generations that noted a move from "more for your money" each generation towards "more for more", i.e. keeping the value roughly static and increasing the amount charged for a more capable product.
Krutonium 2 days ago [-]
To be fair, if this keeps up, expect the price of a PS5 to skyrocket too.
Hamuko 2 days ago [-]
Hopefully Sony has long-term contracts for their components. I presume they have an idea of how many PS5s they're going to be making still.
nrhrjrjrjtntbt 1 days ago [-]
All that but they can still jack up the price cus why not.
gishh 17 hours ago [-]
Because, unless this changed, consoles are loss-leaders. At least back in the ps2/gamecube/OG Xbox, the systems were sold at a loss and the money was recouped on controllers and games.
Can’t use a ps2 controller to play a ps2 game on a ps2 without the ps2 console.
If this is still true or not, I don’t know. I do know that the ps5 with an optical drive cost $100 more than the digital edition. I also know that the drive does not cost $100 and sincerely doubt the labor makes up the difference.
So maybe I talked myself out of my whole point.
Yokolos 2 days ago [-]
Give it a month or two and it might be cheaper to get the bus.
zorked 2 days ago [-]
It's a more stable unit than US dollars.
Aurornis 2 days ago [-]
> When did a PS5 become a unit of cost? For reference seems to be about 0.002 London buses
Gaming consoles are something people buy. Any parent or gamer has an idea what they cost.
People do not buy London Buses themselves
silisili 2 days ago [-]
Seems like an American thing. We measure distances in football fields and volumes in olympic pools, seems we now measure money in PS5s. It tracks...
icehawk 1 days ago [-]
Never.
That's an analogy-- a literary technique the writer is using to show the correspondence between the price of a specific amount of DDR5 RAM and a fully integrated system, so the reader can follow the conclusions of the article more easily.
hn92726819 2 days ago [-]
Approximately the instant when a single component (RAM) of a comparable product (Gaming PC) became more expensive than the entirety of said product.
I wonder what you'd think if bus tires exploded in price and started costing 0.25 London buses per tire.
Lots of people are speculating that the price spike is AI related. But it might be more mundane:
I'd bet that a good chunk of the apparently sudden demand spike could be last month's Microsoft Windows 10 end-of-support finally happening, pushing companies and individuals to replace many years worth of older laptops and desktops all at once.
Marsymars 7 hours ago [-]
I have no idea about the number of people this has actually affected, but this is exactly my situation. Need a new workstation with a bunch of RAM to replace my Win10 machine, so I don't really have viable options than paying the going rate.
ProllyInfamous 2 days ago [-]
I worked in enterprise laptop repair two decades ago — I like your theory (and there's definitely meat there) but my experience was that if a system's OEM configuration wasn't enough to run modern software, we'd replace the entire system (to avoid bottlenecks elsewhere in the architecture).
RachelF 1 days ago [-]
Perhaps the memory manufacturers have seen how much Apple gets away with charging for the memory on their laptops and have decided to copy them ;-)
ErneX 1 days ago [-]
It’s not speculation, but it could also be both.
Someone1234 2 days ago [-]
It will be interesting to see the knock on effect of some upcoming consumer electronics; for example Apple was rumored to be working on a cheaper MacBook that uses an iPad CPU, and Valve is working on a SteamOS based gaming machine. Both will likely live/die based on price.
Aurornis 2 days ago [-]
It's way too early to assume these prices are permanent. It's a supply crunch meeting a demand spike. The market will find equilibrium.
Big manufacturers also order their DRAM in advance with contractually negotiated pricing. They're not paying these spot market prices for every computer they ship.
dsiddharth 2 days ago [-]
edit: looks like I had the wrong understanding, thanks to the comments below for explaining
~~helps that Apple's SoC has the RAM on the main die itself. They're probably immune from these price hikes, but a lot of the PC/Windows vendors wouldn't be, which would only make Apple's position even stronger~~
bryanlarsen 2 days ago [-]
They're probably immune for a while because they're probably using a long term contract, but when it comes time to renew they'll have to offer close to market price to convince the manufacturers not to use that fab space for more profitable memory.
arjie 2 days ago [-]
How does that make a difference? It's not like the price change is on DIMMs. The price change is on the DRAM, which is a commodity item. It's not like someone is going to discount it if you tell them "nah, I'm going to solder this one to my SoC".
If Apple is insulated it is likely because Apple signs big contracts for large supply and manufacturers would prefer to be insulated from short-term demand shocks and have some reliability that their fabs can keep running and producing profitable chips.
hoherd 2 days ago [-]
I also had that misunderstanding, so after seeing this comment I looked up some info. In this article you can see the x-ray of the M1 chip composited onto a photo of the chip, which has external memory components. You can also see in the architecture diagram that the memory is attached from outside the area where the Fabric, CPU, GPU, NPU, cache, and some other unlabeled things are located. https://www.macrumors.com/guide/m1/
Perhaps on the same package (stacked) but absolutely not on the same die.
201984 2 days ago [-]
Which Apple product is this?
Memory dies and logic dies require entirely different factories to make, so I doubt this SoC exists.
radicality 2 days ago [-]
I just checked how much I paid around 12 months ago for Crucial 96GB kit (2x48GB ddr5 5600 so-dimm). Was $224, same kit today I see listed at $592, wild :/
skwee357 2 days ago [-]
This is insane!
I got 2 sticks of 16GB DDR4 SODIMM for €65.98 back in February. The same two sticks in the same store now cost €186
pmdr 2 days ago [-]
Same, bought in August for $250 (EU), now it's ~$840. I ended up returning the laptop I'd bought it for and thought 'why hold on to the RAM, it'll only depreciate in value,' so I returned that too. Better hold on to my PS5, I guess.
justinram11 1 days ago [-]
Just bought that exact kit for my Minisforum 790S7 build at the eye watering $592... Kicking myself as I was just starting to contemplate it early Oct but not yet seriously looking
NewsaHackO 2 days ago [-]
I bought 2x 32 GB DDR5 in August for $150. Now it's $440. I dodged a HUGE bullet.
ThatMedicIsASpy 2 days ago [-]
96GB 6400, 380€ 2023-11
tucnak 2 days ago [-]
I did buy 384 GB worth of Samsung DDR5-4800 sticks for my homelab a few months ago. I was wondering at the time if I really needed it, well ended up using it anyway, and turns out, dodged a bullet big time.
poemxo 1 hours ago [-]
Unless they stop making DDR5 and come out with DDR6, I think prices should return to normal next year.
tonymet 1 days ago [-]
The silver lining is that hopefully it’ll become too expensive to ship new Electron apps
Narishma 1 days ago [-]
You have too much faith in the software industry.
dysoco 21 hours ago [-]
This. Users and developers will just get accustomed to 2s delays everywhere. The era of native software is long gone.
tonymet 16 hours ago [-]
There are still snobs / autists like me who notice
tombert 1 days ago [-]
RAM has been cheap long enough and now no one remembers how to write efficient GUI apps.
I'm joking, but only kind of. It's not a domain that I do a lot of, but I haven't touched Qt in so long that it would basically be starting from scratch if I tried to write an app with it; I could write an Electron app in like an hour.
tonymet 24 hours ago [-]
You can learn how to build a gui in Xcode or Visual Studio in a few hours
kasabali 19 hours ago [-]
Why do you think developers would care if it's expensive to users?
tonymet 19 hours ago [-]
There was a time
StableAlkyne 2 days ago [-]
It's crazy how much RAM has inflated in the last month. I checked the price history of a few DDR5 kits and most have tripled since September.
georgefrowny 2 days ago [-]
Why specifically just now? It doesn't seem that much has materially changed very recently.
debazel 2 days ago [-]
It's due to every hyperscaler building out new AI datacenters. For example, you have Google recently saying things like "Google tells employees it must double capacity every 6 months to meet AI demand", and that they need to increase capacity by 1000x within 4-5 years.
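Those two figures are roughly consistent with each other; a quick sketch of the compounding:

```python
# "Double every 6 months" compounds to roughly 1000x over 4-5 years.
for years in (4, 5):
    growth = 2 ** (2 * years)  # two doublings per year
    print(f"{years} years: {growth}x")
# -> 4 years: 256x, 5 years: 1024x
```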
debo_ 2 days ago [-]
The oft-snickered-at "smuggling 3mb of hot RAM" line from Neuromancer may have been prophetic after all.
retrac 2 days ago [-]
If you are a scifi author, it's a mistake to give any hard numbers in real-world units. You will, most likely, greatly underestimate. Even trying to greatly overestimate, you will underestimate.
Commander Data's specifications in the Star Trek TNG episode The Measure of a Man from 1989: 800 quadrillion bits of storage, computing at 60 trillion operations per second.
100 petabytes. That's a big machine. A very big machine. But supercomputers now have memories measured in petabytes.
They never used "bits" again in any Star Trek script. It was kiloquads and gigaquads from then on.
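(The unit conversion, for anyone who wants to check it:)

```python
# Commander Data's quoted capacity, converted to modern units.
bits = 800e15                 # "800 quadrillion bits"
petabytes = bits / 8 / 1e15   # 8 bits per byte, 1e15 bytes per (decimal) petabyte
print(f"{petabytes:.0f} PB")  # -> 100 PB
```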
teach 2 days ago [-]
That's fun! To further prove your point I saw this and thought "yeah maybe 100 PB is more common these days but 60 trillion ops / second seems like a lot"
Then I did some googling and it turns out that a single 5090 GPU has a peak FP32 performance of over 100 TFLOPS!
nottorp 1 days ago [-]
Pretty sure Commander Data's software wasn't written in Electron so the hardware was enough :)
cyberjunkie 1 days ago [-]
Crypto: GPUs
AI: RAM
Thanks for taking away years of affordable computing from people. Time is more valuable; there's no getting it back.
beeflet 1 days ago [-]
I am skeptical of the narrative that crypto caused GPU demand to meaningfully increase. I think it was always AI
nottorp 1 days ago [-]
"We have always been at war with Eastasia" ?
dawnerd 2 days ago [-]
Noticed SSDs went up too. There's a "black friday" sales price for a 4TB crucial external drive that's at its highest price in 90 days.
Bad time if you need to build a computer.
rkagerer 2 days ago [-]
Article says:
Looking at it optimistically, you're probably going to find DDR5 at bargain prices again in 2027.
When do you think prices will recede again?
dathinab 2 days ago [-]
Never fully. Like with GPUs, it's a semi-cartel, and DRAM is in everything, including your high-performance SSD (as cache). They have a reason to keep prices super high for ~2 years; then they will come down, but only "somewhat". Let's say if the peak is >2x current pricing, the price in 2027 will be ~1.5x-1.8x.
And because everything needs memory, expect all electronics to be ~20%-80% more expensive in 2027 compared to today; naturally this includes the profit margin.
And naturally every regulation these companies don't like will supposedly be at fault for this (e.g. right to repair)
at least that is a wild speculation on my side
Modified3019 2 days ago [-]
I built 4 systems between Jan-May for myself and family, very fortuitous timing, because no way would I be doing it now.
caycep 2 days ago [-]
Because some people up top loved the idea of the old Smoot-Hawley thing
StopDisinfo910 2 days ago [-]
RAM prices are cyclical. We are in the under supply part of the cycle.
People just have to wait. As prices are sky high, production capacity will likely increase. Some AI companies will go bust. Demand will plummet and we will buy RAM for pennies while the market consolidates.
bryanlarsen 2 days ago [-]
That's historically what happened when we had proper competition. Now we have a 3-party oligopoly and massive barriers to entry. At least one of the three is actively signalling that they're not going to spend hundreds of billions to expand fab capacity that would lower their profits, because if one does it they'll all do it. It's a prisoner's dilemma, and they're co-operating. When they co-operate we all lose.
timschmidt 2 days ago [-]
The entry of Chinese DRAM into the market may spur increased competition. If not for competition's sake alone, for national security and domestic supply chain integrity concerns.
Seattle3503 1 days ago [-]
Or the existing companies get fat, lazy, and then slaughtered.
nicolaslem 2 days ago [-]
That is also somewhat true for GPUs, hard drives and SSDs. They all usually have different cycles but today AI is making them peak all at the same time.
venturecruelty 2 days ago [-]
Great, we can eat the RAM when we're all unemployed.
f0rgot 1 days ago [-]
Dude - I laughed out loud on this comment (I guess we're at the cry or laugh stage). Almost woke up my sleeping kiddo.
lousken 2 days ago [-]
Everyone should unsub from this AI frenzy, this is ridiculous
I bought 32GB of DDR5 SODIMM last year for 108€ on Amazon. The exact same product that I bought back then is now 232€ on Amazon. I don't like this ride.
tombert 1 days ago [-]
Yeah, similar for me. I bought 64 gigs of DDR5 laptop RAM about a year ago; it ended up costing about $190. Now the exact same listing is going for $470. https://a.co/d/fJH1GkW
I guess I'm glad I bought when I did; didn't realize how good of a deal I was getting.
ares623 1 days ago [-]
I thought a big factor with the AI hype is that hardware costs always go down. Is this not a huge red flag to investors?!
mkornaukhov 3 hours ago [-]
Sad for all of us =(
geerlingguy 2 days ago [-]
Ouch. Wondering if homelabs will be scavenged for unused RAM as even DDR4 is going up in price :(
Havoc 2 days ago [-]
I’ve been selling the ddr4 I had lying around. Also consider removing some from desktop since I don’t really use 64gb.
Hamuko 2 days ago [-]
I'm personally waiting for the first DDR5 heist. Breaking into a computer store and taking all of the RAM that isn't soldered down.
tabs_or_spaces 17 hours ago [-]
It started with GPUs, then hard drives and now RAM.
It will probably take a while, but is the general public going to be priced out of computers eventually?
nehal3m 2 days ago [-]
I picked up a PS5 today on a Black Friday deal for 350EUR. 32GB DDR5 is at around 280EUR at the moment.
I have a gaming PC, it runs Linux because (speaking as a Microsoft sysadmin with 10 years under my belt) I hate what Windows has become, but on commodity hardware it’s not quite there for me. Thought I’d play the PlayStation backlog while I wait for the Steam Machine.
tapoxi 2 days ago [-]
Don't forget to play Astro's Playroom, it comes with the system and it's a blast.
nehal3m 2 days ago [-]
Thanks, will do!
tylerflick 1 days ago [-]
Also, Spiderman 2 since it’s a Playstation exclusive. Incredible game worth every penny.
zombot 5 hours ago [-]
Shush, AI is good for all of us, don't you dare sow doubt about that.
HacklesRaised 2 days ago [-]
AI is a net negative for anyone not in on the grift.
pmdr 2 days ago [-]
Hang in there, abundance is on its way! /s
ares623 1 days ago [-]
Abundance of PRs waiting for reviewers!
nathanaldensr 1 days ago [-]
I've been calling it "the last great bubble" whenever discussing it with my family and friends. This is Peak Grift.
mikepavone 2 days ago [-]
Wow, I only paid $265 for 96GB of DDR5 back in April. Same brand (G.SKILL) as the kit in the article too.
ekropotin 1 days ago [-]
Who could have thought that a high-end PC could be an appreciating asset, huh
prmoustache 1 days ago [-]
I don't understand using a game console as a price comparison.
Joyfield 1 days ago [-]
Can't be using the standard banana measurement for everything.
Velocifyer 23 hours ago [-]
They actually mean 64GiB.
gnarlouse 2 days ago [-]
"2026: Cost of manufacturing PC cases increases 60% due to increased demand from Optimus production line" or some other dumb shit
AstroBen 1 days ago [-]
Holy shit the 32GB DDR5 I bought late october for $110 is now $300
Felt like I overpaid at the time too. Wow
sylware 1 days ago [-]
Why this shortage? A sudden demand increase? Issues with the supply of refined rare earth metals?
christkv 1 days ago [-]
They have been sort of dumping the ps5 slim diskless over here in the EU at 349 EUR.
blindriver 1 days ago [-]
Will this affect the prices of Macbook Pros and Mac Studios, especially the 512 GB version?
lysace 2 days ago [-]
I'm waiting for the Apple TV 4K 4th gen. I think it might be one or two more years, on top of the now three years since the 3rd gen (2022).
AI/LLM companies will pay TSMC more than Apple is willing to further subsidize this neat little box.
ck2 2 days ago [-]
for anyone looking for a deal, thank me later, buy asap
ebay .com /itm/ 256168320806
(no association, just trying to help, I am still using DDR4)
BizarroLand 17 hours ago [-]
Holy crap.
I bought right after this curve hit, like the day after. I went into Microcenter for a new PC w/64gb ddr5. The day before, their kits were ~$189. The day I bought they were $360. Now the same kit on Newegg is $530.
It's been 2 weeks.
TheRealPomax 2 days ago [-]
Is it really a shortage, rather than unfair order fulfillment, when it's just four companies buying up everything? There's plenty of RAM, it's just getting sold to the people who yell the loudest instead of going "sorry we have more customers than just you" and fulfilling orders to everyone.
luxuryballs 2 days ago [-]
but can 64GB of DDR5 memory run Crysis? …I’ll see myself out
s5300 2 days ago [-]
[dead]
schmidtleonard 2 days ago [-]
[flagged]
forgetfulness 2 days ago [-]
He’ll have pissed off a lot of very rich and powerful people once they’re forced to accept that they’ll still need us plebes in the end. Does he have a way out when that happens?
koakuma-chan 2 days ago [-]
Did Sam Altman buy out all gaming RAM?
John23832 2 days ago [-]
OpenAI bought 40% of all wafers. So yes.
rtkwe 2 days ago [-]
LLMs are voracious consumers of RAM, especially the top full-power models, so yes: not directly, but they're all consuming massive amounts of memory makers' production capacity, and that's impacting the 'gaming RAM' market too. They're not separate markets just because the end use is slightly different.
wkat4242 2 days ago [-]
No, but the factory capacity and raw materials needed to make that gaming DRAM are now under higher demand to make stuff like GDDR and HBM.
If a Bond villain bought all of the oil in the Middle East, would it affect prices in the US? (Yes, of course it would.)
dboreham 2 days ago [-]
Quick reminder that DRAM futures have existed since the 1980s so you all could have protected your price with calls.
venturecruelty 2 days ago [-]
Call me old-fashioned, but I shouldn't have to have a stock broker to buy a computer. Maybe we could re-organize society to be a bit less ridiculous. "Quick reminder that you could've been born rich instead of a povvo."
nrhrjrjrjtntbt 1 days ago [-]
Most of us would be better off with a fixed-rate mortgage, given the risk/reward. You don't buy a computer every month (unless you do, but that is rare).
f0rgot 1 days ago [-]
where?
citizenpaul 1 days ago [-]
i deeply hope this is /s
2WSSd-JzVM 1 days ago [-]
Well, the alternative is that GP solved world hunger, since people can just buy food futures contracts to protect themselves from drought.
lvl155 1 days ago [-]
This is purely price gouging, because this RAM is not ECC or server grade.
zamadatix 1 days ago [-]
The article references the original coverage which talks to this:
> Despite server-grade RDIMM memory and HBM being the main attractions for hardware manufacturers building AI servers, the entire memory industry, including DDR5, is being affected by price increases. The problem for consumers is that memory manufacturers are shifting production prioritization toward datacenter-focused memory types and producing less consumer-focused DDR5 memory as a result.
But I'm sure the hysteria around that isn't helping prices come back down either.
nubinetwork 1 days ago [-]
Except when you have datacenters also building racks with desktop hardware. I believe that was hetzner?
I'm not sure I follow, varying range necessarily implies varying ratios (e.g. a product missing from the range means its ratio is zero).
Even when in theory you can obtain some higher quality products, the composition of the crude can make it too complex and expensive to practically obtain them.
You don't want to refine gasoline from heavy crude, especially in winter when demand is lower. For gasoline or kerosene you want to start from lighter crude. Same with many undesired components (either from the crude or resulting from the refining methods), the more you have, the more complex the refining, and the resulting ratio of products you obtain varies.
So in practice what you get out of the refining process absolutely depends on the characteristics of the crude, and many other things like market demand or the capability of your refinery.
Same as with silicon. The process to make the wafer results in different quality if you want to make low tech or cutting edge semiconductor products.
What do you mean "Break contracts"? I thought the conversation was about Futures contracts, you don't break them. You sell your contract or you take/give delivery (or cash settle).
Not all gas is sold by futures, you can have a contract for, say, delivery of 20 million cubic metres of gas a year and a penalty if that isn't met. Some people actually want the gas for gas-related purposes rather then as a financial phantom.
Same for DRAM - Dell actually wants the chips to put in computers, an economic abstraction doesn't help much when you need to ship real computers to get paid, and many customers aren't in the market for a laptop future (Framework pre-orders notwithstanding).
Various chemicals too, https://haz-map.com/Processes/97
Borrowing costs can be wildly variable and are the main cost of making silicon. All the "inputs" over the lifecycle of a fab are so completely dwarfed by the initial capital costs that you can pretty much ignore them in any economic analysis. The cost of making chips is the cost of borrowing money to pay for capital costs, and the depreciation of the value of that capital.
If only.
20 years ago, fabs were being built to use 90nm-class technology. Chips made on such an old node are so cheap today that they can't pay even a fraction of a percent of the plant's capital costs per year. So all of its capital has to have been depreciated a long time ago.
The oldest process node in high-volume production for memory is currently 1α, which started production in January 2021. It is no longer capable of making high-end products and is definitely legacy, and also has to have essentially depreciated all of the capital costs. The time a high-end fab stays high-end and can command premium prices, and during which it has to depreciate all the capital is ~3-5 years. After that either you push the plant to produce legacy/low price and low margin items, or you rebuild it with new tools with costs >$10B.
Also, even if fabs did last 20-30 years, the capital costs would dominate.
> And wouldn't "wildly variable borrowing costs" also affect oil and gas who need to finance the research phase and construction of the plant?
I don't understand. Nothing else costs anywhere near as much capital to produce as silicon chips. Thanks to the inexorable force of Moore's second law, fabs are machines that turn capital investment into salable product; nothing like them has ever existed before.
Even if you pay them all 500k per year, that's "only" about a billion a year in payroll.
The New York fab plan costs something like $20 billion to build now, with $100 billion planned over 20 years.
Also, maybe the calculus is different right now in the US, but it used to be that semiconductor workers were expected to have PhDs coming out of their ears but were not actually paid very well, with salaries in Taiwanese fabs being around the $50-60k mark and lower-paid workers being more like $20k or less. Presumably US fabs will be automated to an even greater extent due to labour costs.
So it's very possible that servicing debt on the capital outlay is substantially more expensive than the payroll.
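For a rough sense of scale, here is a minimal back-of-envelope sketch. All of the inputs are illustrative assumptions (a $20B build-out depreciated over 5 years, ~2,000 staff at $500k each, a 5% cost of capital), not figures sourced from the thread:

```python
# Back-of-envelope comparison of fab capital costs vs. payroll.
# Every input below is an illustrative assumption, not a sourced figure.

capital_cost = 20e9                 # fab build-out cost, USD (assumed)
useful_life_years = 5               # years the fab stays leading-edge (assumed)
cost_of_capital = 0.05              # annual borrowing cost (assumed)
employees = 2_000                   # headcount (assumed)
avg_fully_loaded_salary = 500_000   # USD/year per employee (assumed)

annual_depreciation = capital_cost / useful_life_years
annual_interest = capital_cost * cost_of_capital
annual_payroll = employees * avg_fully_loaded_salary

print(f"Depreciation: ${annual_depreciation / 1e9:.1f}B/yr")
print(f"Interest:     ${annual_interest / 1e9:.1f}B/yr")
print(f"Payroll:      ${annual_payroll / 1e9:.1f}B/yr")
# With these assumptions, capital-related costs (~$5B/yr) dwarf payroll (~$1B/yr).
```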
Yes, via cxmt as discussed by Asianometry here: https://www.youtube.com/watch?v=mt-eDtFqKvk
As I mentioned, various groups within China have been working on China-native DRAM since the '70s. What's new are the margins and market demand to allow them to be profitable with DRAM which is still several years behind the competition.
A Japanese factory that made epoxy resin for chips was destroyed and the price of SIMM chips skyrocketed (due to lack of availability).
I remember being very upset that I wasn't going to be able to upgrade to 4MB.
But yes we're going to need more fabs for sure
If the shortage of RAM is because of AI (so servers/data centers I presume?), wouldn't that mean the shortage should be localized to RDIMM rather than the much more common UDIMM that most gaming PCs use? But it seems to me like the pricing is going up more for UDIMM than RDIMM.
Wouldn't that mean that a shortage of DRAM chips should cause price increases in all of them? Not sure that'd explain why RDIMM prices aren't rising as sharply as UDIMM. That the fab and assembly lines have transitioned into making other stuff would explain why there'd be a difference though, as bradfa mentioned in their reply.
The manufacturers make the individual chips, not the modules (DIMMs). (EDIT: Some companies that make chips may also have business units that sell DIMMS, to be pedantic.)
The R in RDIMM means register, aka buffer. It's a separate chip that buffers the signals between the memory chips and the controller.
Even ECC modules use regular memory chips, but with extra chips added for the ECC capacity.
It can be confusing. The key thing to remember is that the price is driven by the price of the chips. The companies that make DIMMs are buying chips in bulk and integrating them on to PCBs.
Quite a few unbuffered designs in the past had a "missing chip". If you ever wondered why a chip was missing on your stick, it's missing ECC. Don't know if it's still the case with DDR5 though.
Also with DDR5 each stick is actually 2 channels so you get 2 extra dies.
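As a rough illustration of why ECC means "extra chips", here is a small sketch of the chip-count arithmetic. The bus widths are the standard JEDEC ones (a 64-bit channel for DDR4, two 32-bit subchannels per DDR5 stick, each gaining 8 ECC bits on ECC modules); the x8 chip width is an assumption for the example:

```python
# Rough chip-count arithmetic for unbuffered DIMMs, assuming x8 DRAM chips.

def chips_per_rank(data_bits: int, ecc_bits: int, chip_width: int = 8) -> tuple[int, int]:
    """Return (data chips, ECC chips) needed for one rank (or subchannel)."""
    return data_bits // chip_width, ecc_bits // chip_width

# DDR4: one 64-bit channel, +8 ECC bits on ECC modules
ddr4_data, ddr4_ecc = chips_per_rank(64, 8)

# DDR5: two 32-bit subchannels per stick, each +8 ECC bits on ECC modules
sub_data, sub_ecc = chips_per_rank(32, 8)
ddr5_data, ddr5_ecc = 2 * sub_data, 2 * sub_ecc

print(f"DDR4 ECC rank: {ddr4_data} data + {ddr4_ecc} ECC chips "
      f"({ddr4_ecc / ddr4_data:.1%} overhead)")   # 8 + 1 -> 12.5%
print(f"DDR5 ECC stick (both subchannels): {ddr5_data} data + {ddr5_ecc} ECC chips "
      f"({ddr5_ecc / ddr5_data:.1%} overhead)")   # 8 + 2 -> 25.0%
```

That 8-vs-9 (or 8-vs-10) chip count is exactly the "missing chip" you see on non-ECC sticks.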
Was reading a series of displeased posts about it. Can't seem to find it now.
Which, every modern ECU will do automatically based on output from the knock sensors.
Fuel blends also vary a bit between winter and summer; basically, in winter they can get away with putting in a bit more volatile compounds because it's colder.
I am hoping some of that Clayton Christensen disruption the tech theocracy keep preaching about comes along with some O(N) decrease in transformer/cDNN complexity that disrupts the massive server farms required for this AI boom/bubble thing.
Compute is cheaper than ever. The ceiling is just higher for what you can buy.
Yes, we have $2000 GPUs now. You don't have to buy it. You probably shouldn't buy it. Most people would be more than fine with the $200-400 models, honestly. Yet the fact that you could buy a $2000 GPU makes some people irrationally angry.
This is like the guy I know who complains that pickup trucks are unfairly priced because a Ford F-150 has an MSRP of $80,000. It doesn't matter how many times you point out that the $80K price tag only applies to the luxury flagship model, he anchors his idea of how much a pickup truck costs to the highest number he can see.
Computing is cheaper than ever. The power level is increasing rapidly, too. The massive AI investments and datacenter advancements are pulling hardware development forward at an incredible rate and we're winning across the board as consumers. You don't have to buy that top of the line GPU nor do you have to max out the RAM on your computer.
Sometimes I think people with this mentality would be happier if the top-of-the-line GPU models were never released. If Nvidia stopped at their mid-range cards and didn't offer anything more, the complaints would go away, even though we wouldn't actually be better off with fewer options.
If the result was that games were made and optimised for mid-range cards, maybe regular folks actually would be better off.
Low end is ryzen integrated graphics now, xx60 is mid range at best. Maybe even xx50 if those still exist.
"It's mid range if it exists" doesn't make sense.
Also you're missing that they're talking about 3070, a card from 2020 (5 years ago), 2 generations behind this year's 50xx series. The 30xx matters more than the xx70 here. It was an upper midrange card when it came out, and it's solidly midrange for Nvidia's product lineup today. You can have cheaper and decent just fine (integrated Ryzens like you mentioned are fine for 1080p gaming on most titles).
So no one makes a 25k model.
This is missing the forest for the trees quite badly. The $2000 GPUs are what previously would've been $600-700, and the $200-400 GPUs are now $600-700. Consumers got a shit end of the deal when crypto caused GPUs to spike, and now consumers are getting another shitty deal with RAM prices. And even if you want mid-range stuff, it's harder and harder to buy because of how fucked the market is.
It would be like if in your example companies literally only sold F-150s and stopped selling budget models at all. There isn't even budget stock to buy.
A GTX 1080 came out in the first half of 2016. It had 8 GB of VRAM and cost $599 with a TDP of 180W.
A GTX 1080 Ti came out in 2017 and had 11 GB of VRAM at $799.
In 2025 you can get the RTX 5070 with 12 GB of VRAM. They say the price is $549, but good luck finding them at that price.
And the thing with VRAM is that if you run out of it then performance drops off a cliff. Nothing can make up for it without getting a higher VRAM model.
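Purely as a worked comparison of the numbers quoted above (launch MSRPs, not adjusted for inflation, and street prices have often been higher), a quick dollars-per-GB-of-VRAM calculation:

```python
# Dollars per GB of VRAM at launch MSRP, using the figures quoted above.
# No inflation adjustment; actual retail prices frequently exceed MSRP.

cards = {
    "GTX 1080 (2016)":    (599, 8),
    "GTX 1080 Ti (2017)": (799, 11),
    "RTX 5070 (2025)":    (549, 12),
}

for name, (msrp, vram_gb) in cards.items():
    print(f"{name}: ${msrp} / {vram_gb} GB = ${msrp / vram_gb:.0f} per GB")

# Roughly $75, $73 and $46 per GB respectively. The catch, as noted above,
# is that once a game exceeds the VRAM pool, performance falls off a cliff
# no matter how good the price-per-GB looks.
```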
I did one Google search for "rtx 5070 newegg usa" and they have MSI Ventus GeForce RTX 5070 12G down from $559 to $499 for Black Friday, and ASUS Prime RTX 5070 12GB for $543.
https://www.newegg.com/msi-geforce-rtx-5070-12g-ventus-2x-oc...
https://www.newegg.com/asus-prime-rtx5070-12g-geforce-rtx-50...
-the whole reason why the GPU is $2000 is because of said AI bubble sucking up wafers at TSMC or elsewhere, with a soupçon of Jensen's perceived monopoly status...
-for a good part of the year, you could not actually buy said $2000 GPU (I assume you are referring to the 5090) also because of said AI bubble
(granted, while Jensen does not want to sell me his GPU, I would like to point out that Tim Cook has no problem taking my money).
on that point, I can go and buy a Ford F150 tomorrow. Apparently, per the article, I would have problems buying bog standard DDR5 DIMMS to build my computer.
Perfectly stated. I think comments like the one above come from a mentality that the individual consumer should be the center of the computing universe and big purchasers should be forced to live with the leftovers.
What's really happening is the big companies are doing R&D at incredible rates and we're getting huge benefits by drafting along as consumers. We wouldn't have incredible GPUs in our gaming systems and even cell phones if the primary market for these things was retail entertainment purchases that people make every 5 years.
Computing didn't take off until it shrank from the giant, unreliable beasts of machines owned by a small number of big corporations to the home computers of the 70s.
There's a lot more of us than them.
There's a gold rush market for GPUs and DRAM. It won't last forever, but while it does high volume sales at high margins will dominate supply. GPUs are still inflated from the crypto rush, too.
3Dfx was not the inventor of the GPU. There’s a long history of GPU development for corporate applications.
The iPhone wasn’t the first mobile phone. Early mobile phones were very expensive and targeted as businesses who wanted their executives in touch
You’re still thinking from a consumer-centric view. Zoom out and those consumer companies were not the first to develop the products. You didn't even think about the actual originators of those types of products because you don’t see them as a consumer.
The iPhone isn't exactly a consumer computation device. From that perspective, it does less work at a higher cost.
(I remember the huge window in which phone companies desperately put out feature phones with sub-par touch screens, completely missing the value to consumers. The iPod Touch should've been warning enough... and should've been (one of) my signal(s) to buy Apple stock, I guess :-)
HN is strange. I have an old gaming build from 7-8 years ago and while it can do high end games on low settings and resolution, it doesn’t hold a candle to even a mid-range modern build.
“viable” is doing a lot of work in that claim. You can tolerate it at low res and settings and if you’re okay with a lot of frame rate dips, but nobody is going to mistake it for a modern build.
You’re also exaggerating how fast video cards became obsolete in the past. Many of us gamed just fine on systems that weren’t upgraded for 5-6 years at a time.
And back in the early 2000s, even bleeding edge current-year rigs would struggle with new games like Doom 3, Far Cry, Crysis, and so on. Hardware was advancing so rapidly that games were being built in anticipation of upcoming hardware, so you had this scenario where high end systems bought in one year would struggle with games released that year, let alone systems from 5-6 years prior.
Obviously if you're referencing CRPGs and the like, then yeah - absolutely anything could run them. The same remains even more true today. Baldur's Gate 3's minimum requirement is a GTX 970, a card more than 11 years old. Imagine a 1989 computer trying to run Baldur's Gate 2!
[1] - https://www.youtube.com/watch?v=PRHjzDg_VHw
- https://news.ycombinator.com/item?id=44499245
Arguably we don't. Most of the improvements these days seem to be on the GPGPU side with very little gains in raster performance this decade.
I have a flagship 7-8 year old GPU in one machine and a mid-level modern GPU in another.
It’s flat out wrong to claim “very little gains” during this time. The difference between those two GPUs is huge in games. The modern GPU also does it with far less power and noise.
I can’t understand this HN mentality that modern hardware isn’t fast or that we’re not seeing gains.
Games were always going to go 3D sooner or later. The real pressure of the high-volume competitive market got us more and more capable chips until they were capable enough for the kind of computation needed for neural networks, faster than a slow-moving specialty market could have.
Yes. That is my point. The customers willing to pay the high initial R+D costs opened up the potential for wider adoption. This is always the case.
Even the gaming GPUs which have grown in popularity with consumers are derivatives of larger designs intended for research clusters, datacenters, aerospace, and military applications.
No question that chip companies are happy to take consumers money. But I struggle to think of an example of a new technology which was invented and marketed to consumers first.
Many 3D games like Doom, Quake 1, Flight Unlimited, etc. ran purely on software rendering, since CPUs were already providing enough oomph to render fairly useful 3D graphics in the mid '90s. CPU power was enough, but consoles/arcades showed there was more to be gotten (though nothing was hindering games at that point).
And already there, the capital investment for game consoles (Atari, NES, SNES, PS1, PS2, etc.) and arcade games (like the above-mentioned 3D games) was big enough to fund custom chipsets not used or purposed for anything else. (I also think that in the '80s/'90s the barrier to entry for making competitive custom chips was a tad lower; just consider the Cambrian explosion of firms during the '90s making x86 and later ARM chips.)
Yes, there were vendors that focused on high-end commercial customers, and yes, many alumni of those firms contributed a ton of expertise toward what we have today.
But if you look at which companies survived and pushed the envelope in the longer run, it was almost always companies that competed in the consumer market, and it was only when those consumer chips needed even more advanced processing that we reached the point where the chips became capable of running neural networks.
In fact, I'd say that had the likes of SGI prevailed, we would've had to wait longer for our GPU revolution. Flight simulators, etc. were often focused on larger, more detailed worlds; PS2-era chips with higher polycounts and more memory would have been fine for simulator developers for a long time (since more detail in a military scenario would have been fine).
Leisure games have always craved fidelity on a more "human" level. To implement "hacks" for things like custom dynamic lighting models, then global illumination, subsurface scattering, etc., we've needed arbitrary programmability, since the raw power wasn't there (the most modern raytracing chips are _starting_ to approach those levels without too-ugly hacks).
A colleague who worked with me about 10 years ago on a VDI project ran some numbers and showed that if a Time Machine were available, we could have brought like 4 loaded MacBook Pros back and replaced a $1M HP 3PAR ssd array :)
The move to cloud computing and now AI mean that we're back in the mainframe days.
You still do. There is no "AI movement" you need to participate in. You can grab a copy of SICP and a banged-up ten-year-old ThinkPad and compute away; your brain will thank you. It's like when people complain that culture is unaffordable because the newest Marvel movie tickets cost 50 bucks: go to the library or standardebooks.org, the entire Western canon is free.
Living on the edge from 4 years ago is basically free.
Hard disagree. A $600 Mac Mini with 16GB of RAM runs everything insanely faster than even my $5000 company-purchased developer laptops from 10 years ago. And yes, even when I run Slack, Visual Studio Code, Spotify, and a gazillion Chrome tabs.
The HN rhetoric about modern computing being slow is getting strangely disconnected from the real world. Cheap computers are super fast like they've never been before, even with modern software.
People ran multiple browser windows, a 3D video game, irc (chat application), teamspeak/ventrilo (voice chat) and winamp (music) all at once back in the early 2000s. This is something an 8 year old phone can do these days.
That an 11-year-old PC can keep up today (with or without an upgrade) is evidence that systems are keeping up with software bloat just fine. :)
Bullshit. It was cramped and I wasn't able to do half of what I was wanting to actually do. Maybe it was plenty for your usecases, but such a small amount of memory was weak for my needs in the late 90s and 2000s. 64MB desktops struggled to handle the photo manipulations I wanted to do with scanned images. Trying to do something like edit video on a home PC was near impossible with that limited amount of memory. I was so happy when we managed to get a 512MB machine a few years later, it made a lot of my home multimedia work a lot better.
Besides, you just said you only needed 512MB, which is still nothing these days.
I didn't say I "only needed 512MB", only that things were a lot better once we got a 512MB machine. Things continued to get massively better as I upgraded to a 1GB machine, an 8GB machine, etc.
> I'm talking in general
Isn't doing some light picture editing/organizing, playing back multimedia, etc. pretty dang general computer use these days? Or what, is "general" computer usage entirely limited to 80 column text manipulation? You'd have a hard time even just keeping my displays drawn with 64MB of memory at the resolutions and bit depths and multiple desktops that are common.
I play around with retro computers (especially the early/mid 90s for that nostalgia) and I'm constantly reminded of how little memory we really had to play with back then, and these are pretty much fully loaded home desktop machines. Have a Word document open while you're trying to play back an MP3 and keep a couple of browser windows open? Oof, good luck! You want to stream a video? I hope it's about 10FPS at 320x240! Open one photo from my camera today and you'll have used half the memory before it's even hit the framebuffer.
That the manufacturers are scumbags is the more likely answer.
https://en.wikipedia.org/wiki/DRAM_price_fixing_scandal
Anyway, that's the kind of market that governments always need to act upon and either supply directly or regulate intensively.
But my guess is that this shortage is short-lived (mostly because of the threat above). There's no OPEC for tech.
Being shocked that companies try their best to deal with the bad cards they have been dealt should be expected. The money system simply cannot express the concept of surplus capital or abundance. Positive interest means capital is scarce, so capital must be made scarce even if there is abundance.
Before you come up with the argument that the interest rate is supposed to reflect a market property and therefore does not force itself upon the market, remember that I said that there is an artificial restriction in the money system that prevents the state of the real market to be expressed. The non-profit economy has never had a chance to exist, because our tools are too crude.
The non-profit economy includes resilient production with slight/minor overproduction.
Think about how stupid the idea of a guaranteed 0% yield bond is (aka cash). The government obligates itself to accept an infinite amount of debt if the real return on capital would ever fall negative. No wonder it has an incentive to inflate the value of the bond away.
Maximizing profit is the only sane way to play a rigged game
https://www.trendforce.com/price/dram/dram_spot
Unforeseen things like the pandemic hurt profits.
Letting things go this unmanaged with a 3-year runway for AI demand seems a little hard to understand. In this case, not anticipating demand seems to create more profit.
Next is probably CPUs: even if AIs don't use them that much, manufacturers will shift production to something more profitable, then gouge prices so that only enterprises will pay for them.
What's next? Electricity?
Where the f*k is all the abundance that AI was supposed to bring into the world? /rant
Things being too cheap allows money to pool at the bottom in little people's hands in the forms of things like "their homes" and "their computers" and "their cars".
You don't really want billions in computing hardware (say) being stashed down there in inefficient, illiquid physical form, you want it in a datacentre where it can be leveraged, traded, used as security, etc. If it has to be physically held down there, ideally it should be expensive, leased and have a short lifespan. The higher echelons seem apparently to think they can drive economic activity by cycling money at a higher level amongst themselves rather than looping in actual people.
This exact price jump seems largely like a shock rather than a slow squeeze, but I think we're seeing some kind of reversal of the unique 20th-century "life gets better/cheaper/easier every generation" pattern.
To me the #1 most important factor in a maintaining a prosperous and modern society is common access to tools by the masses, and computing hardware is just the latest set of tools.
Yes, that's the point. People fixing things themselves doesn't make the line go up, therefore it will be made harder.
And I assume some of them read these threads, so my advice to them would be to remember that the bunker air vents will probably be the main weak point.
In the hands of the owners of the AI, as a direct consequence of the economic system. It was never going to play out any other way.
An abundance of intelligence on Earth with all its spoils: new medicine, energy, materials, technologies and new understandings and breakthroughs - these seem quite significant to me.
Super-intelligence is a completely different can of worms. But I'm not optimistic about super-intelligence either. It seems super naive to me to assume that the spoils of super-intelligence will be shared with the people who no longer can bring anything to the table. You aren't worth anything to the super-rich unless you can do something for them which the super-intelligence can't do.
And when did "the rich" hoard anything for themselves only?! Usually I see them democratizing products and services so they are more accessible to everyone, not less.
Computers in my pocket and on my wrist, TVs as big as a wall and thin like a book, electric cars, flights to anywhere I dream of traveling, investing with a few clicks on my phone - all made possible to me by those evil and greedy rich in their race for riches. Thank you rich people!
You still need to be rich to partake. Most business ventures will still require capital even in the age of super-intelligence. Super-intelligence will make labor worthless (or very cheap) it won't make property worthless.
> And when did "the rich" hoard anything for themselves only?! Usually I see them democratizing products and services so they are more accessible to everyone, not less.
There are plenty of examples of rich people hoarding their wealth. Countries with natural resources often have poor citizens because those citizens are not needed to extract that wealth. There is little reason why super-intelligence will not lead to a resource curse where the resource is human intelligence or even human labor.
> Computers in my pocket and on my wrist, TVs as big as a wall and thin like a book, electric cars, flights to anywhere I dream of traveling, investing with a few clicks on a website - all made possible to me by those evil and greedy rich in their race for riches. Thank you rich people!
Those rich people didn't share with you out of the goodness of their heart but because it was their best strategy to become even richer. But that's no longer the case when you can be replaced by super-intelligence.
Again, you can invest, today, in AI stocks and ETFs, with just $100 and a Robinhood account. No need to be rich.
> Super-intelligence will make labor worthless (or very cheap) it won't make property worthless.
If the labor is worthless, the great majority of people will be poor. Due to the law of supply & demand, property will be worthless since there will be very little demand for it.
> Countries with natural resources often have poor citizens because those citizens are not needed to extract that wealth.
Countries with or without resources often have poor citizens simply because being poor is the natural state of mankind. The only system that, historically, allowed the greatest number of people to exit poverty is capitalism. Here in Eastern Europe we got to witness an astonishing change of fortunes when we switched from communism to capitalism. The country and its resources didn't change, just the system and, correspondingly, the wealth of the population.
> it was their best strategy to become even richer. But that's no longer the case when you can be replaced by super-intelligence.
How can they become richer when most people are dirt broke (because they were replaced by AIs) and thus can't buy their products and services? Look at how even Elon's fortunes shrink when his company misses a sales forecast. He is only as rich as the number of customers he can find for his cars.
And then? I'll compensate the loss of thousands of dollars I don't earn anymore every month with the profits of a $100 investment in some ETF?
> If the labor is worthless, the great majority of people will be poor. Due to the law of supply & demand, property will be worthless since there will be very little demand for it.
Property has inherent value. A house I can live in. A farm can feed me. A golf course I can play golf on. These things have value even if nobody can buy them off me (because they don't have anything I want). Supply and demand determine only the _price_ not the _value_ of goods and services.
> Countries with or without resources often have poor citizens simply because being poor is the natural state of mankind. The only system that, historically, allowed the greatest number of people to exit poverty is capitalism. Here in Eastern Europe we got to witness an astonishing change of fortunes when we switched from communism to capitalism. The country and its resources didn't change, just the system and, correspondingly, the wealth of the population.
None of this has any connection to anything I've written. I'm talking about the concept of a resource curse. Countries rich in natural resources (oil, diamonds, ...) where the population is poor as dirt because the ruling class has no incentive to share any of the profits. The same can happen with AI if we don't do anything about it.
> How can they become richer when most people are dirt broke (because they were replaced by AIs) and thus can't buy their products and services?
Other rich people can buy their products and services. They don't need you to buy their products and services because you don't bring anything to the table because all you have is labor and labor isn't worth anything (or at least not enough to survive off it). Put differently: Why do you think rich people would like to buy your labor if using AI/robots is cheaper? What reason would they have to do that?
> Look at how even Elon's fortunes shrink when his company misses a sales forecast. He is only as rich as the number of customers he can find for his cars.
You're proving my point: Elon still lives in a world where labor is worth something. Because Elon lives in a world where labor is worth something it is in his interest that there are many people capable of providing that labor to him. This means it is in his interest that the general population has access to food and water, is well educated, ...
If Elon were to live in a world where labor is done by AI/robots there would be little reason for him to care. Yes, he couldn't sell his cars to the average person anymore, but he wouldn't want to anyway. He could still sell his cars to Altman in exchange for an LLM that strokes his ego or whatever rich people want.
The point is: Because rich and powerful people still have to pay for labor, their incentives are at least somewhat aligned with the incentives of the average person.
We know that none of the goods you listed would be available to the masses unless there was profit to be gained from them. That's the point.
I have a hard time believing a large group being motivated and mutually benefiting towards progression of x thing would result in worse outcomes than a few doing so. We just have never had an economic system that could offer that, so you assume the greedy motivations of a few is the only path towards progress.
Please propose it yourself.
> you assume the greedy motivations of a few is the only path towards progress
No. I assume the greedy motivations of the many is the best path towards progress. Any other attempts to replace this failed miserably. Ignoring human nature in ideologies never works.
If you want to look at what historically has happened when the rich have had a sudden rapid increase in intelligence and labor, we have examples.
After the end of the Punic wars, the influx of slave labor and diminution of economic power of normal Roman citizens lead to: accelerating concentration of wealth, civil war and an empire where the value of human life was so low that people were murdered in public for entertainment.
Yet those things did not happen in communist countries (or happened way less in socialist ones), during the same time period, even though the market was there too. That is why EU's socialist countries consume high tech products and services from the USA and not the other way around.
Humanity will have to adopt new human-focused modes of living and organizing society, or else. And climate change is coming along to make sure the owning class can't ignore this fact any longer.
But please, don't be coy: tell us about that other system that is designed for "human flourishing" - we're dying to learn about it.
Because I grew up under communism and I lived its miserable failures: the non-profit system didn't even manage to feed, clothe or warm/cool us.
> new human-focused modes of living and organizing society
Oh, these sound sooo promising. Please do tell us: would you by any chance be willing to use force to "convince" the laggards of the benefits of switching? What if some refuse to believe your gospel? Will you turn to draconic laws and regulations?
There are shades of grey here. Capitalism is a system with many inherent problems. Exploring alternatives is not the same thing as being a Stalinist
It's like they lack the most basic understanding of economics and never read any history. I mean, communism has failed everywhere it was tried and there were so many A/B tests that plainly show each system's results: North vs South Korea, Eastern Europe before vs after 1990, USA vs USSR, Argentina during the last hundred years, Venezuela before and after Chavez, etc.
Or they push socialism under new names ("democratic") as if it's a new thing, not just a watered down form of communism, with authoritarian communism being the logical end game of socialism - because "at some point you run out of other people's money" and you need force to keep fleecing them. Just like it happened in Venezuela...
Capitalism increasingly fails to provide well-being to the majority of the global population. It's obvious we need to come up with something else, even if it's not clear yet what shape that will take.
If we can't find an alternative that works, we can also just wind down humanity, and not much of value to the universe will be lost :)
You don't need to go full communist to make things better.
We have to go cry to "daddy Trump" for protection, unable to even defend ourselves when a greedy, blood-thirsty country with an economy smaller than Italy's decided to attack.
Regulating health care made medical research escape to USA. Regulating building created the biggest housing crisis affecting young couples most, in turn reducing natality even further. Regulating energy pushed us in the warm embrace of the Russian bear who exploited our dependency to the max. GDPR ensured no web-based EU startup can be competitive to the US-ones. Regulating AI was just funny at that point since it is all made in China and the USA anyway...
Yeah, you don't need to go full communist to f things up but the closer you go, the worse things get.
> But please, don't be coy: tell us about that other system that is designed for "human flourishing" - we're dying to learn about it.
Libertarian socialism, anarchocommunism, any system where human freedom is the basis, and not coercion or hierarchy. This stuff is not new or radical, it's just not favored by people with lots of money to lose.
> Oh, these sounds sooo promising. Please do tell us: would you by any chance be willing to use force to "convince" the laggards of the benefits of switching? What if some refuse to believe your gospel? Will you turn to draconic laws and regulations?
Lucky for you, no. The complete opposite. Freedom of association is the entire foundation of it. We all get to associate with whomever we want, when and for as long as we want. Someone being a condescending prick in your local comment section? You get to ignore them! No commissars or cops or Party. Someone wants to go play hierarchical capitalism with his friends? As long as he's not messing with other people or contravening their rights, they get to do whatever they want.
Will any of these systems result in 99 cent stores, fast food restaurants, or people on the moon? Almost definitely not. But those are all irrelevant to creating a sustainable environment designed for human beings, and not profit.
The lack of innovation (or even reading of basic history...) in what is possible in terms of organizing human societies is frankly sad, especially among tech workers. Most people are too influenced by capitalism (intentionally so) to believe that how things are now is the only way they can be. There is so little scope for innovation and change, and that starts with the owning class who have no interest in it changing.
With a hundred bucks and a Robinhood account, you too can be part of this greedy, evil and mysterious "owners of AI" class and (maybe) some day enjoy the promised spoils.
Oh the wonders of Capitalism, the economic system offering unequal abundance to everyone caring to take part... Where are the other, much touted systems, masters at spreading misery equally?
Yes. My electricity prices jumped 50% in 3 years.
How much is due to long overdue infrastructure upgrades and greed by providers, vs the cost of energy?
Also, consumer prices _have_ risen (mine included), but it's not clear that this is only because AI. While EV charging is not at the scale of all data centers combined, it seems to grow even faster than the datacenter's consumption, and is expected to eclipse the latter around 2030. Maybe sooner due to missing solar incentives.
Also, to rant on: According to [1], an average Gemini query costs about 0.01 cents (Figure 2 - say 6000 queries per kWh at 60 cents/kWh, which is probably more than the industrial consumers pay). The same paper says one other providers is not off by that much. I dare say that at least for me, I definitely save a lot of time and effort with these queries than I'd traditionally have to (go to library, manually find sources on the web, etc), so arguably, responsibly used, AI is really quite environmentally friendly.
Finally: Large data centers and their load are actually a bit fungible, so they can be used to stabilize the grid, as described in [2].
I would think it would be best if there were more transparency on where the costs come from and how they can be externalized fairly. To give one instance, Tesla could easily [3] change their software to monitor global grid status and adjust charging rates. Did it happen ? Not that I know. That could have a huge effect on grid stability. With PowerShare, I understand that vehicles can also send energy back to power the house - hence, also offload the grid.
[1] https://services.google.com/fh/files/misc/measuring_the_envi...
[2] https://www.linkedin.com/feed/update/urn:li:activity:7358514...
[3] that's most likely a wild exaggeration
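Sanity-checking the per-query figure above: at roughly 6,000 queries per kWh and an assumed 60 cents/kWh (the comment's own assumption; industrial rates are typically much lower), the arithmetic works out as follows:

```python
# Rough per-query energy cost, using the figures cited in the comment above.

queries_per_kwh = 6_000        # from the comment's reading of the Google paper
price_per_kwh_cents = 60       # assumed retail-ish electricity price, in cents

cost_per_query_cents = price_per_kwh_cents / queries_per_kwh
energy_per_query_wh = 1_000 / queries_per_kwh

print(f"~{cost_per_query_cents:.3f} cents per query")   # ~0.010 cents
print(f"~{energy_per_query_wh:.2f} Wh per query")       # ~0.17 Wh
```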
This only makes sense if you ignore profits. We've been paying the bills since before this was "overdue"; for instance, I am still paying a storm recovery surcharge on my electric bill from before I ever moved to this state. At the point where a "temporary infrastructure surcharge for repairs" becomes a line item on their profit statement, that's where I start to get real annoyed.
Our electric company has 287,000 customers and has a market cap of >$800,000,000
what percentage of that eight tenths of a billion in market cap came from nickel and diming me?
* note: nickel and dime was established as "an insignificant amount of money" in the 1890s, where sirloin, 20% fat was $0.20 a pound. That's $13.50 now (local); chuck $0.10 and $19 now. So somewhere between 67 and 180 times less buying power from a nickel and dime, now. Also that means that, y'know, my surcharges being $15-$30 a month is historically "nickels and dimes"
https://babel.hathitrust.org/cgi/pt?id=uiug.30112019293742&s...
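A quick check of that buying-power footnote, using the quoted beef prices (1890s sirloin at $0.20/lb vs. ~$13.50 now; chuck at $0.10/lb vs. ~$19 now; these are local prices, so take the exact multiples loosely):

```python
# Buying-power multiples implied by the beef prices quoted above (local prices).

sirloin_then, sirloin_now = 0.20, 13.50   # USD per pound, 1890s vs. today
chuck_then, chuck_now = 0.10, 19.00

print(f"Sirloin multiple: {sirloin_now / sirloin_then:.0f}x")   # ~68x
print(f"Chuck multiple:   {chuck_now / chuck_then:.0f}x")       # ~190x

# By either multiple, a $15-$30 monthly surcharge really is in historical
# "nickels and dimes" territory (a nickel then maps to roughly $3.40-$9.50 now).
```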
I mean part of me thinks it's a necessary evil because we relied too much on Russian gas in the first place. But that's because we extracted most of our own gas already (https://en.wikipedia.org/wiki/Groningen_gas_field), which that article lists as one of the factors in the Dutch welfare state being a thing - it and smaller fields out at sea contributed over 400 billion to the Dutch economy since the 1950's.
That and water. Electricity: Google made a post about scaling k8s to 135,000 nodes yesterday, mentioning how each node has multiple GPUs taking up 2700 watts max.
Water, well this is a personal beef, but Microsoft built a datacenter which used potable / drinking water for backup cooling, using up millions of liters during a warm summer. They treat the water and dump it in the river again. This was in 2021, I can imagine it's only gotten worse again: https://www.aquatechtrade.com/news/industrial-water/microsof...
Why do you think this is a lot of water? What are the alternatives to pulling from the local water utility and are those alternatives preferable?
[0] https://en.wikipedia.org/wiki/North_Holland
[1] https://en.wikipedia.org/wiki/Water_supply_and_sanitation_in...
[2] https://www.dutchdatacenters.nl/en/statistics-2/
If you make $4k/mo and rent is $3k, it's pretty silly to state that it's a meaningful thing for someone to scrimp and invest $100/mo into a brokerage account.
They definitely should do this, but it's not going to have any meaningful impact on their life for decades at best. Save for a decade to get $12k in your brokerage account, say it doubles to $24k. If you then decide you can take a generous 5% withdrawal rate you are talking $1,200/yr against rent that is now probably $3,500/mo or more. Plus you're killing your compounding.
It's good to have so emergencies don't sink you - but it's really an annoying talking point I hear a lot lately. Eye rolling when you are telling someone struggling this sort of thing.
It really only makes a major impact if you can dump large amounts of cash into an account early in life - or if you run into a windfall.
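A minimal sketch of that savings math, assuming $100/month for a decade; the 7% annual return is an illustrative assumption, not a prediction, and the "doubles" case mirrors the scenario in the comment above:

```python
# Illustrative savings math: $100/month for 10 years, then a 5% withdrawal rate.
# The 7% annual return is a hypothetical assumption for the example.

monthly_contribution = 100
years = 10
annual_return = 0.07
withdrawal_rate = 0.05

balance = 0.0
for _ in range(years * 12):
    balance = balance * (1 + annual_return / 12) + monthly_contribution

print(f"Balance after {years} years at 7%: ${balance:,.0f}")                    # ~$17,300
print(f"Income at {withdrawal_rate:.0%} withdrawal: ${balance * withdrawal_rate:,.0f}/yr")  # ~$870/yr

doubled = monthly_contribution * years * 12 * 2   # the "say it doubles to $24k" case
print(f"If contributions doubled instead: ${doubled * withdrawal_rate:,.0f}/yr")  # $1,200/yr
print(f"...vs. rent of $3,500/mo = ${3_500 * 12:,}/yr")
```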
5% of that would be $8100.
Is $48k / year a typical income?
Yeah, obviously if I can sock away $12,000 a year for 10 years I'll have money. Just be aware that I was in a bunch of funds from 2010-2020, and were it not for what happened at the end of that decade I wouldn't have made any additional money at all. In fact, I would have lost a decent chunk of money - not just in fees, but inflation.
Also, where are you guaranteed 6% for a decade? T-bills or something?
The centabillionaire who invests their fortune and receives $50M a day from the market and cares about nothing more than keeping the gravy train going is the problem. They head to Washington and splash their free money hose around in exchange for political support to keep that hose pumping at all costs. That's why politicians were so lock-step about how it was super important for the US to sell our industry to China and drop capital gains tax below income tax and open the Double Irish Sandwich and now they're getting rid of social programs and instituting 50 year mortgages and so on.
The fact that the guy with the firehose can pump the firehose by stepping on the guy investing $1k month is the core of the problem. Until you are at escape velocity -- your net worth is enough to cover your lifestyle from investment returns alone -- you lose by default.
like under a bridge or something? Pardon the hyperbole, but you would have to assume people with no disposable income are idiots in order to suggest that solution.
There are some exceptions: rural doctors, for example, can make more than city doctors due to high demand. But the less "physical" your job is, the rarer these exceptions become.
For software devs, you can move out of silicon valley. Maybe to Texas. And now those 1.5 million dollar homes are only 700,000 dollars. But your salary will reflect that.
AI will lead to abundance. For those that own stuff and no longer have to pay for other people to work for them.
Why are you saying that? Anybody working for a living (but saving money) can invest in AI stocks or ETFs and partake in that potential abundance. Robinhood accounts are free for all.
> (but saving money)
These two are already difficult or impossible for many people. Especially a big chunk of USAmericans have been living paycheck to paycheck for a long time now, often taking multiple jobs just to make ends meet.
And then to gamble it on a speculative market, whose value does not correlate with its performance (see e.g. Tesla, compare its sales / market share with its market value compared to other car manufacturers). That's bad advice. And for an individual, you'd only benefit a fraction from what the big shareholders etc earn. Investing is a secondary market.
That doesn't mean they are poor, just poor with their money. Saving is a skill that needs to be learned.
> That's bad advice.
No. Not investing, when the S&P500 index had a 6% inflation-adjusted annual historical return over the last 100 years - is bad advice. Not hedging the arrival of an AGI that you think can replace you - is bad advice.
Yes, you can drill your own well to have water in a society, for some. Or, you come up with the unheard-of idea of public utilities, so people can simply open a tap and enjoy. In some regions, even drink it. Personally, growing up in a lucky place like that, I have a hard time imagining to ever live in a place that required me to buy bottled water.
Yes, you can demand each member of society to learn about ETFs. Personally, I enjoy every part of life where complexity is being dealt with for me; I wouldn’t survive a day without that collaboration.
We have a choice to design society, like we have a choice to design computer systems.
Investing in AI companies is just about the last piece of advice I'd give someone who's struggling financially.
The billionaires will largely be fine. They hedge their bets and have plenty of spare assets on the side. Little guy investors? Not so much. They'll go from worrying about their retirement plan to worrying about becoming homeless.
I remember when there was a flood in Thailand in 2011 and the prices of hard disks went through the roof.
https://www.forbes.com/sites/tomcoughlin/2011/10/17/thailand...
That can be a bigger problem for civilization.
The AI bubble has also pushed up secondhand prices. I work in ewaste recycling. Two weeks ago, a petabyte's worth of hard drives showed up. After erasing, testing, listing, and selling them, I'll have a very merry Christmas.
That'll come with the bubble bursting and the mass sell off.
How long do you estimate this period of supply constraint will be? Will manufacturers continue to be greedy, or will they grow less greedy as the supply improves based on the price signals the high price indicates?
next?
https://www.cnbc.com/2025/11/14/data-centers-are-concentrate...
more money for shareholder, 5 Trillion Nvidia???? more like a quadrillion for nvidia market cap
Abundance isn't even the right framing. What most people actually want and need is a certain amount of resources - after which their needs are satiated and they move onto other endeavors. It's the elites that want abundance - i.e. infinite growth forever. The history of early agriculture is marked by hunter-gatherers outgrowing their natural limits, transitioning to farming, and then people figuring out that it's really fucking easy to just steal what others grow. Abundance came from making farmers overproduce to feed an unproductive elite. Subsistence farming gave way to farming practices that overtaxed the soil or risked crop failure.
The history of technology had, up until recently, bucked this trend. Computers got better and cheaper every 18 months because we had the time and money to exploit electricity and lithography to produce smaller computers that used less energy. This is abundance from innovation. The problem is, most people don't want abundance; the most gluttonous need for computational power can be satisfied with a $5000 gaming rig. So the tech industry has been dealing with declining demand, first with personal computers and then with smartphones.
AI fixes this problem, by being an endless demand for more and more compute with the economic returns to show for it. When AI people were talking about abundance, they were primarily telling their shareholders: We will build a machine that will make us kings of the new economy, and your equity shares will grant you seats in the new nobility. In this new economy, labor doesn't matter. We can automate away the entire working and middle classes, up to and including letting the new nobles hunt them down from helicopters for sport.
Ok, that's hyperbole. But assuming the AI bubble doesn't pop, I will agree that affordable CPUs are next on the chopping block. If that happens, modular / open computing is dead. The least restrictive computing environment normal people can afford will be a Macbook, solely because Apple has so much market power from iPhones that they can afford to keep the Mac around for vanity. We will get the dystopia RMS warned about, not from despotic control over computing, but from the fact that nobody will be able to afford to own their own computer anymore. Because abundance is very, very expensive.
https://www.bbc.co.uk/news/articles/c246pv2n25zo
It may have been a bit self-deprecating, but I think your “rant” is a more than justified question that really should be expanded well beyond just this matter. It’s related to a clear fraud that has been perpetrated upon the people of the western world in particular for many decades and generations now, in many different ways. We have been told for decades and generations that “we have to plunder your money, debase it, and give it to the rich that caused the {insert disaster caused by the ruling class} misery, and we have to do it without any kind of consequences for the perpetrators, and no, you don’t get any kind of ownership or investment, and we have to do it now or the world will end”
What is your source on that? Moore's Law is Dead directly contradicts your claims by saying that OpenAI has purchased unfinished wafers to squeeze the market.
https://www.youtube.com/watch?v=BORRBce5TGw
Tom’s Hardware: “Samsung raises memory chip prices by up to 60% since September as AI data-center buildout strangles supply,” Nov 2025. https://www.tomshardware.com/tech-industry/samsung-raises-me...
Note the consistent "up to 60% since September" figure in the above recent reports. That's for one module capacity, with others being up 30% to 50% - and it certainly isn't the 200% or more we're apparently seeing now in the retail market. That's pure panic hoarding, which is actually a very common overreaction to a sudden price spike.
As someone currently fighting to shave megabytes off a C++ engine, it hurts my soul to see a simple chat app (Electron) consume 800MB just to idle. We spent the last decade using Moore's Law to subsidize lazy garbage collection, five layers of virtualization, and shipping entire web browsers as application runtimes. The computer is fast, but the software is drowning it.
The memory business is a pure commodity and brutally cyclic. Big profit => build a fab => wait 2 years => oh shit, everyone else did it => dump units at below cost. Repeat.
https://en.wikipedia.org/wiki/Pork_cycle
> Memory: 16 GB GDDR6 SDRAM
So unless the RAM price jumps to 4x the price of a PS5, getting a PS5 is not the most cost efficient way to get to 64 GB of RAM.
In comparison, PS3 has been used to build cheap clusters[2].
[1]: https://en.wikipedia.org/wiki/PlayStation_5
[2]: https://en.wikipedia.org/wiki/PlayStation_3_cluster
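A quick version of that break-even arithmetic, using the 16 GB figure from the spec quoted above and an assumed $499 console price (the thread doesn't quote one):

```python
# Break-even check: when would buying PS5s for their RAM make sense?
# The $499 console price is an assumption; the 16 GB figure is from the PS5 spec above.

ps5_price = 499          # USD, assumed
ps5_ram_gb = 16
target_gb = 64

consoles_needed = target_gb // ps5_ram_gb          # 4 consoles for 64 GB
cost_via_consoles = consoles_needed * ps5_price    # ~$1,996

print(f"{consoles_needed} consoles -> ${cost_via_consoles} for {target_gb} GB")
print(f"DDR5 would need to exceed ~${cost_via_consoles / target_gb:.0f}/GB "
      f"(~${cost_via_consoles} per 64 GB kit) before this made sense")
# And the GDDR6 is soldered to the console anyway, so you couldn't reuse it.
```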
Props on building a PC with your kid, I have very fond memories of doing that with my dad. Have fun!
I'm now seeing $480 CAD for a single stick.
It used to be a general rule of thumb that you could build a computer of roughly equivalent power for the cost of a game console, or a little more — now the memory costs more than the whole console.
The PS5 has 16GB of RAM. One can buy 16GB of RAM for ~$100 [1].
[1] https://pcpartpicker.com/product/9fgFf7/kingston-fury-beast-...
But since it's 16 GB, the comparison doesn't really make sense.
On a PC you may have the bright idea to open a browser along with the game for walkthroughs/hints. Or Discord to chat with your friends while gaming.
Due to JavaScript bloat, your working set size goes from 16 to 48-64 GB in a jiffy.
You do have the option to open up Discord voice chats on PS5. Amazing what Discord could do when forced to actually write something efficient.
Youtube also exists as an app, and maybe you can trick the heavily gimped built in browser to go there as well, although last I checked it wasn't trivial.
Can’t use a ps2 controller to play a ps2 game on a ps2 without the ps2 console.
If this is still true or not, I don’t know. I do know that the ps5 with an optical drive cost $100 more than the digital edition. I also know that the drive does not cost $100 and sincerely doubt the labor makes up the difference.
So maybe I talked myself out of my whole point.
Gaming consoles are something people buy. Any parent or gamer has an idea what they cost.
People do not buy London Buses themself
That's an analogy, a literary technique the writer is using to show the correspondence between the price of a specific amount of DDR5 RAM and a fully integrated system, so the reader can follow the article's conclusions more easily.
I wonder what you'd think if bus tires exploded in price and started costing .25 London busses per tire.
I'd bet that a good chunk of the apparently sudden demand spike could be last month's Microsoft Windows 10 end-of-support finally happening, pushing companies and individuals to replace many years worth of older laptops and desktops all at once.
Big manufacturers also order their DRAM in advance with contractually negotiated pricing. They're not paying these spot market prices for every computer they ship.
~~Helps that Apple's SoC has the RAM on the main die itself. They're probably immune from these price hikes, but a lot of the PC/Windows vendors wouldn't be, which would only make Apple's position even stronger.~~
If Apple is insulated it is likely because Apple signs big contracts for large supply and manufacturers would prefer to be insulated from short-term demand shocks and have some reliability that their fabs can keep running and producing profitable chips.
And in this article you can see a photo of the memory chips attached outside of the Apple component https://www.gizmochina.com/2020/11/19/apple-mac-mini-teardow...
I got 2 sticks of 16GB DDR4 SODIMM for €65.98 back in February. The same two sticks in the same store now cost €186
I'm joking, but only kind of. It's not a domain that I do a lot of, but I haven't touched Qt in so long that it would basically be starting from scratch if I tried to write an app with it; I could write an Electron app in like an hour.
Commander Data's specifications in the Star Trek TNG episode The Measure of a Man from 1989: 800 quadrillion bits of storage, computing at 60 trillion operations per second.
100 petabytes. That's a big machine. A very big machine. But supercomputers now have memories measured in petabytes.
They never used "bits" again in any Star Trek script. It was kiloquads and gigaquads from then on.
Then I did some googling and it turns out that a single 5090 GPU has a peak FP32 performance of over 100 TFLOPS!
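The unit conversion behind those two comparisons, as a small sketch (the 5090's ~100 TFLOPS FP32 figure is taken from the comment above, not verified here):

```python
# Converting Commander Data's quoted specs into modern units.

storage_bits = 800e15            # "800 quadrillion bits"
ops_per_second = 60e12           # "60 trillion operations per second"
rtx_5090_fp32_flops = 100e12     # ~100 TFLOPS FP32, per the comment above (assumed)

storage_petabytes = storage_bits / 8 / 1e15
print(f"Storage: {storage_petabytes:.0f} PB")                              # 100 PB
print(f"Data vs. one 5090: {ops_per_second / rtx_5090_fp32_flops:.1f}x")   # ~0.6x
```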
AI: RAM
Thanks for taking away years of affordable computing from people. Time is more valuable; there's no getting it back.
Bad time if you need to build a computer.
Looking at it optimistically, you're probably going to find DDR5 at bargain prices again in 2027.
When do you think prices will recede again?
And because everything needs memory, expect all electronics to be ~20%-80% more expensive in 2027 compared to today; naturally, this includes the profit margin.
And naturally, every regulation that companies don't like will supposedly be blamed for this (e.g. right to repair).
at least that is a wild speculation on my side
People just have to wait. As prices are sky high, production capacity will likely increase. Some AI companies will go bust. Demand will plummet and we will buy RAM for pennies while the market consolidates.
I guess I'm glad I bought when I did; didn't realize how good of a deal I was getting.
It will probably take a while, but is the general public going to be priced out of computers eventually?
I have a gaming PC, it runs Linux because (speaking as a Microsoft sysadmin with 10 years under my belt) I hate what Windows has become, but on commodity hardware it’s not quite there for me. Thought I’d play the PlayStation backlog while I wait for the Steam Machine.
Felt like I overpaid at the time too. Wow
AI/LLM companies will pay TSMC more than Apple is willing to further subsidize this neat little box.
I bought right after this curve hit, like the day after. I went into Microcenter for a new PC w/64gb ddr5. The day before, their kits were ~$189. The day I bought they were $360. Now the same kit on Newegg is $530.
It's been 2 weeks.