NHacker Next
AMD will bring its "Ryzen AI" processors to standard desktop PCs for first time (arstechnica.com)
snovv_crash 9 minutes ago [-]
How much dedicated cache do these NPUs have? Because it's easy enough to saturate the memory bandwidth using the CPU for compute, never mind the GPU. Adding dark silicon for some special operations isn't going to make our memory bandwidth any faster.
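The ceiling described here is easy to quantify: during LLM decode, every generated token has to stream the full weight set through memory once, so bandwidth, not NPU FLOPS, sets the limit. A back-of-envelope sketch (the bandwidth and model-size figures below are illustrative assumptions, not specs for any particular part):

```python
# Back-of-envelope: tokens/sec ceiling for memory-bandwidth-bound LLM decode.
# All numbers are illustrative assumptions, not specs for any real chip.

def decode_tokens_per_sec(model_bytes: float, bandwidth_bytes_per_sec: float) -> float:
    """Each generated token streams every weight once, so the ceiling is
    bandwidth / model size, no matter how fast the NPU's matmuls are."""
    return bandwidth_bytes_per_sec / model_bytes

model_gb = 7    # e.g. a 7B-parameter model at 8-bit quantization ~= 7 GB
dram_gb_s = 90  # assumed dual-channel DDR5 desktop bandwidth ~= 90 GB/s

print(f"~{decode_tokens_per_sec(model_gb * 1e9, dram_gb_s * 1e9):.1f} tok/s ceiling")
```

Under those assumptions the ceiling is roughly 13 tok/s whether the compute comes from CPU, GPU, or NPU, which is the parent's point.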
tuukkah 20 minutes ago [-]
Meanwhile, the corresponding "non-standard" desktop PC is the Framework Desktop, which with the Ryzen AI Max+ 395 can use 120GB of its 128GB RAM for the GPU: How to Run a One Trillion-Parameter LLM Locally: An AMD Ryzen™ AI Max+ Cluster Guide https://www.amd.com/en/developer/resources/technical-article...
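The cluster angle follows from simple memory arithmetic: even aggressively quantized, a trillion parameters don't fit in one 128GB box. A hedged sketch (the quantization widths are common choices, and the 120GB-usable figure comes from the comment above; machine counts are illustrative):

```python
# Rough memory math behind running a 1-trillion-parameter model locally.
# Quantization widths are typical options, not details from the AMD guide.
import math

def weights_gb(params: float, bits_per_weight: int) -> float:
    """Size of the weights alone, ignoring KV cache and activations."""
    return params * bits_per_weight / 8 / 1e9

PARAMS = 1e12        # 1 trillion parameters
USABLE_GB = 120      # GPU-addressable RAM per Framework Desktop box

for bits in (16, 8, 4):
    gb = weights_gb(PARAMS, bits)
    boxes = math.ceil(gb / USABLE_GB)
    print(f"{bits:2d}-bit: {gb:5.0f} GB of weights -> at least {boxes} machines")
```

Even at 4-bit that's ~500 GB of weights, hence a multi-machine cluster guide rather than a single-box one.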
Buttons840 38 minutes ago [-]
Do we expect special AI processors to diverge from GPUs? Like, processors that can do parallel neural network computations but cannot draw graphics?
jiggawatts 35 seconds ago [-]
Yes.

Even the latest NVIDIA Blackwell GPUs are general purpose, albeit with negligible "graphics" capabilities. They can run fairly arbitrary C/C++ code with only some limitations, and the area of the chip dedicated to matrix products (the "tensor units") is relatively small.

Conversely, the Google TPUs dedicate a large area of each chip to pure tensor ops, hence the name.

This is partly why Google's Gemini is 4x cheaper than OpenAI's GPT-5 models to serve.

Jensen Huang has said in recent interviews that he stands by the decision to keep the NVIDIA GPUs more general purpose, because this makes them flexible and able to be adapted to future AI designs, not just the current architectures.

That may or may not pan out.

I strongly suspect that the "winner" of this race -- for inference -- will be a chip with about 80% of its area dedicated to tensor units, very little onboard cache, and model weights streamed in from High Bandwidth Flash (HBF). This would be dramatically lower power and cost compared to the current hardware that's typically used.

c0balt 21 minutes ago [-]
That is already the case with datacenter "GPUs". An A100, MI300, or Intel PVC/Gaudi has no useful graphics performance or capabilities. Coprocessors à la NPU/VPU are also on the rise again for CPUs.
dagmx 32 minutes ago [-]
That’s already the norm no?

Pretty much every hardware vendor has an NPU

iso-logi 41 minutes ago [-]
8 Core/16 Thread, boosting up to 5.1GHz with iGPU would be pretty neat for a Plex Server or Proxmox Server with a few VMs.
cebert 3 days ago [-]
AMD marketing is hoping the “AI” branding is a positive. Anecdotally, I know many consumers who are not sold on AI. This branding could actually hurt sales.
aljgz 27 minutes ago [-]
We are dealing with hype, but the reality is that AI will change everything we do. Local models will start being helpful in [more] unobtrusive ways. Machines with decent local NPUs will be usable for longer before they feel too slow.
vbezhenar 11 minutes ago [-]
For some people maybe. I don't want to use local AI and NPU will be dead weight for me. Can't imagine a single task in my workflow that would benefit from AI.

It's similar to performance/efficiency cores. I don't need power efficiency and I'd actually buy a CPU that doesn't make that distinction.

orbital-decay 3 minutes ago [-]
Also similar to GPU + CPU on the same die, yet here we are. In a sense, AI is already in every x86 CPU for many years, and you already benefit from using it locally (branch prediction in modern processors is ML-based).
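For the curious: the ML in question is typically a perceptron-style predictor, in the vein of Jiménez and Lin's 2001 design. A minimal sketch (history length, threshold formula, and the loop-branch pattern below are illustrative choices, not any vendor's actual parameters):

```python
# Minimal perceptron branch predictor in the style of Jimenez & Lin (2001) --
# the kind of lightweight ML that has shipped inside real CPU front-ends.
# History length and the branch pattern are illustrative choices.

HISTORY = 8
weights = [0] * (HISTORY + 1)         # bias + one weight per history bit
history = [1] * HISTORY               # +1 = taken, -1 = not taken
THRESHOLD = int(1.93 * HISTORY + 14)  # training threshold from the paper

def predict() -> int:
    """Dot product of weights and history; >= 0 predicts 'taken'."""
    return weights[0] + sum(w * h for w, h in zip(weights[1:], history))

def train(outcome: int) -> None:
    """Update on a misprediction or a low-confidence output, then shift history."""
    global history
    y = predict()
    if (y >= 0) != (outcome > 0) or abs(y) <= THRESHOLD:
        weights[0] += outcome
        for i in range(HISTORY):
            weights[i + 1] += outcome * history[i]
    history = [outcome] + history[:-1]

# A loop branch taken 3 times then skipped, repeating: a learnable pattern.
pattern = [1, 1, 1, -1] * 50
correct = 0
for outcome in pattern:
    correct += (predict() >= 0) == (outcome > 0)
    train(outcome)
print(f"prediction accuracy: {correct / len(pattern):.0%}")
```

A static always-taken guess would get 75% on this pattern; the perceptron learns the "three taken, one not" correlation from history bits alone.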
wood_spirit 14 minutes ago [-]
So I’ve grown a lot warmer to believing that AI can be a better programmer than most programmers these days. That is a low bar :). The current approach to AI can definitely change how effective a programmer is; but then it is up to the market to decide if we need so many programmers. The talk about how each company is going to keep all the existing programmers and just expect productivity multipliers is just what execs are currently telling programmers; that might change when the same execs are talking to shareholders etc.

But does this extrapolate to the current way of doing AI being in normal life in a good way that ends up being popular? The way Microsoft etc is trying to put AI in everything is kinda saying no it isn’t actually what users want.

I’d like voice control on my PC or phone. That’s a use for these NPUs. But I imagine it is like AR: what we all want until it arrives, and then it’s meh.

snovv_crash 11 minutes ago [-]
When I interview people for a job I'm not looking to hire an average programmer, though.
kijin 7 minutes ago [-]
They can just buy a regular Ryzen 9000 series CPU, then. Maybe add a real graphics card if they're into gaming.
skirmish 9 minutes ago [-]
Indeed, I was buying a laptop for my wife, and she was viscerally against "Ryzen AI": "I don't want a CPU with built-in AI to spy on my screen all the time!"
himata4113 45 minutes ago [-]
I'd actually love to have an NPU that isn't useless on my 285k.