New Apple Silicon M4 and M5 HiDPI Limitation on 4K External Displays (smcleod.net)
nuker 3 hours ago [-]
Send it to Tim Cook's email. It worked for me in getting a DisplayPort DSC bug fixed. After Catalina, later macOS releases lost the ability to drive monitors at refresh rates higher than 60Hz.

Apple support tortured me with all kinds of diagnostics, followed by a WontFix a few weeks later. I wrote the email and it got fixed in Sonoma :)

https://egpu.io/forums/mac-setup/4k144hz-no-longer-available...

arvinsim 18 minutes ago [-]
Didn't know that. This probably explains why macOS felt sluggish compared to my Windows PC even though I was using them with the same 144Hz monitor.
smcleod 2 hours ago [-]
I don't expect emails to get through to busy CEOs of huge companies like Apple unless you're really lucky and they make it through some automation, but I have dropped him an email just in case. I guess you never know.
MikeNotThePope 2 hours ago [-]
He has an army of people who read his emails. A truly important one should get read.

You could always try calling, too! I cold called Marc Benioff at Salesforce and he actually picked up the phone.

krackers 1 hours ago [-]
This needs a story. What did you say to him?
harikb 1 hours ago [-]
He was told he had a call from the Pope
not_your_vase 1 hours ago [-]
Heh, classic Mike (not the Pope one)
nuker 21 minutes ago [-]
It probably helps if there are support case numbers with attempted diagnostics attached.
extr 56 minutes ago [-]
Just emailed him. Ridiculous issue.
FireBeyond 2 hours ago [-]
No, it didn’t get fully fixed.

Fucking with DP 1.4 was how they managed to drive the ProDisplay XDR.

If your monitor could downgrade to DP 1.2, you got better refresh rates than on 1.4. Mine could do 95Hz SDR / 60Hz HDR over 1.4, but if the monitor reported it could only do 1.2, that went up to 120/95 on Big Sur and above - when the same hardware could do 144Hz HDR on Catalina.

I would be absolutely unsurprised if their fix was to lie to the monitor in negotiation if it was non-Apple and say that the GPU only supported 1.2, and further, I would be also unsurprised to learn that this is related to the current issue.
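For rough context on why DSC matters in this negotiation, here is a back-of-the-envelope bandwidth comparison. It is a sketch using nominal DP link rates and ignores blanking intervals and protocol overhead, so the figures are approximate:

```python
# Back-of-the-envelope DisplayPort bandwidth check.
# Assumes nominal per-lane rates and 8b/10b encoding; ignores blanking and overhead.

def link_rate_gbps(per_lane_gbps, lanes=4, encoding_efficiency=0.8):
    """Effective payload rate for a DP link (8b/10b encoding = 80% efficient)."""
    return per_lane_gbps * lanes * encoding_efficiency

def video_rate_gbps(width, height, refresh_hz, bits_per_pixel):
    """Raw uncompressed pixel rate in Gbit/s."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

hbr2 = link_rate_gbps(5.4)  # DP 1.2 (HBR2): ~17.28 Gbps effective
hbr3 = link_rate_gbps(8.1)  # DP 1.4 (HBR3): ~25.92 Gbps effective

# 4K 144Hz HDR (10 bits/channel, RGB = 30 bpp) needs ~35.8 Gbps uncompressed,
# which exceeds even HBR3 -- hence DSC is required for that mode.
need = video_rate_gbps(3840, 2160, 144, 30)

print(f"HBR2: {hbr2:.2f} Gbps, HBR3: {hbr3:.2f} Gbps, 4K144 HDR: {need:.2f} Gbps")
print("DSC required over DP 1.4:", need > hbr3)
```

This is why a display stuck negotiating DP 1.4 without working DSC can end up with lower refresh ceilings than one that falls back to DP 1.2 with different mode tables.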

nuker 2 hours ago [-]
Ahh, true, I now top out at 120Hz, but it's fine, which is why I said fixed :) I now recall that on Catalina I had full 144Hz and VRR options! The monitor is a Dell G3223Q via a Caldigit TS4 DP.
FireBeyond 2 hours ago [-]
I was using two 27" LG 27GM950-Bs (IIRC), which could do up to 165Hz and VRR, on a 2019 cheesegrater Mac Pro. It wasn't the cables, or the monitors, or the card.

People at the time were trying to figure out the math of "How did Apple manage to make 6K HDR work over that bandwidth?" and the answer was simply "by completely fucking the DP 1.4 DSC spec" (it was broken in Big Sur, which was released at the same time). The ProDisplay XDR worked great (for added irony, I ended up with one about a year later), but at the cost of Apple saying "we don't care how much money you've spent on your display hardware if you didn't spend it with us" (which tracks perfectly with, I think, Craig Federighi spending so much time and effort shooting down iMessage on Android and RCS for a long time saying, quote, "It would remove obstacles towards iPhone families being able to give their kids Android phones").

nerdsniper 3 hours ago [-]
Thank you
skullone 3 hours ago [-]
I thought I was going crazy when my new M4 seemed "fuzzier" on my external 4Ks. I tried replicating settings from my old MacBook to no avail. I wonder if Apple is doing this on purpose for everything except their own displays.
NBJack 2 hours ago [-]
It's a bit nit-picky on my part, but this bizarre world of macOS resolution/scaling handling vs. other operating systems (including Windows 11, for crying out loud) is one of my biggest gripes with using Apple hardware.

I remember having to work hard years ago to make my non-Apple display look 'right' on an Intel-based Mac, due to weirdness with scaling and resolutions that a Windows laptop didn't even flinch at. It was a mix of hardware limitations and a lack of options for addressing the resolutions and refresh rates available over a Thunderbolt dock - things I shouldn't have to think about.

I honestly hope they finally fix this. I would love it if they allowed sub-pixel text rendering options again too.

Sir_Twist 43 minutes ago [-]
https://news.ycombinator.com/item?id=17477526

This reminds me of this comment, which I feel is a somewhat unsatisfying explanation, given that despite these difficulties, Windows somehow makes it work.

wronglebowski 3 hours ago [-]
Props to the author for putting in what looks like a ton of work trying to navigate this issue. It's a shame they have to go to these lengths to even have their case considered.
MarcelOlsz 3 hours ago [-]
I went to hell and back trying to get PIP/PBP on my 57" G9 ultrawide to work with my M2 Pro. I ended up having to use a powered HDMI dongle, a DisplayLink cable, and DisplayPort, with 3 virtual monitors via BetterDisplay. Allowing resolutions outside of macOS's limitations in BD is what did the trick. I don't envy OP. Having 5120x1440 @ LoDPI was the worst - just ever so slightly too fuzzy, but a perfect UI size - though I eventually got a steady 10240x2880 @ 120Hz with HDR. I literally laughed out loud when I read the title of the thread. Poor guy.
jakehilborn 1 hours ago [-]
You may be able to get this working using PBP and 2 cables without virtual displays. This is my write up for using HiDPI@120hz for two 57” G9s on my M2 MacBook. https://www.reddit.com/r/ultrawidemasterrace/s/VrBLFDxYzg
moeadham 2 hours ago [-]
betterdisplay is a life saver
smcleod 2 hours ago [-]
Thanks, it was a good portion of my weekend bashing my head against the keyboard trying to figure out what was going on and if there was a workaround I could use (there isn't that I've found).
dostick 55 minutes ago [-]
The post reminded me of how I investigated a similar issue with no idea where to start. Using Claude or GPT to investigate this kind of hardware issue is fast and easy: it gives you the next command to try, then the next one, and you end up with a similar summary. I wouldn't be surprised if the author knew nothing about displays before this.
Someone 26 minutes ago [-]
> This aligns with our findings. The M4/M5 DCP firmware implements a conservative framebuffer pre-allocation strategy that:

> Caps the HiDPI backing store to approximately 1.75x the native resolution (6720x3780 for 3840x2160 native), rather than the 2.0x needed for full HiDPI (7680x4320)

So, could that be an off-by-one bug? That might be testable by tweaking the system to think the display supports an even higher resolution.

Also, instead of messing with the Display Override Plist, patching drivers, etc, did they try using the “Advanced…” button in the “Displays” UI? They don’t mention they did.

For me (with a 27 inch 4K monitor not on M4 or M5) that replaces the 5-way choice by one with a list of 11 choices. With the then appearing “Show all resolutions” toggle, that becomes 18.
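As a sanity check on the numbers quoted above, the backing-store arithmetic works out as follows. Note the 1.75x factor is the post's observation, not documented Apple behavior:

```python
# Backing-store sizes for a 4K panel: full 2x HiDPI vs. the observed ~1.75x cap.
# Figures come from the post; the cap is an observation, not a documented limit.

native = (3840, 2160)

full_2x = tuple(int(d * 2.00) for d in native)  # needed for full HiDPI
capped  = tuple(int(d * 1.75) for d in native)  # observed cap on M4/M5

print("full 2x backing store:", full_2x)   # (7680, 4320)
print("observed ~1.75x cap:  ", capped)    # (6720, 3780)
```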

smcleod 17 minutes ago [-]
I tried quite a few things in the "Advanced" section of BetterDisplay - is there something specifically you're suggesting?
arjie 3 hours ago [-]
I'm sure you've already given this a crack via some other technique (I just Cmd-F for it and didn't find) but I have had monitors with confusing EDIDs before that MacOS didn't handle well and the "screenresolution" CLI app https://github.com/jhford/screenresolution always let me set an arbitrary one. It was the only way to get some monitors to display at 100 Hz for me and worked very well for that since the resolution is mostly sticky.
smcleod 35 minutes ago [-]
Hey, thanks - I hadn't tried screenresolution, but it seems to simply set the resolution and refresh rate without controlling the scaling, and controlling the scaling is what's needed to configure the HiDPI mode.
tgma 60 minutes ago [-]
This might be a dumb question: Is the author looking to run 4k display at HiDPI 8k framebuffer and then downscale? What's the advantage of doing so versus direct 4k low-DPI? Some sort of "free" antialiasing?
LarsAlereon 41 minutes ago [-]
From what I understand, the main goal is to fix the problem that non-native (1:1 pixel mapping) resolutions and scaling look worse than native. This is a problem when you ship high-dpi displays that need UI scaling in order for things to be readable. Apple's solution was to render everything at a higher, non-native resolution so that images were always downscaled to fit the display.

So to oversimplify, Windows can have a problem where if you are running 1.5X scaling so text is big enough, you can't fit 4K of native pixels on a 4K display so videos are blurry. If instead you were rendering a scaled image to a 6K framebuffer and then downscaling to 4K, there would be minimal loss of resolution.
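The two-stage pipeline described above can be sketched as follows. This is a simplification for illustration, not the compositor's actual logic:

```python
# Sketch of the macOS-style "render big, downscale" approach described above.
# Simplified illustration; real compositors are considerably more involved.

def macos_style_backing(native_w, native_h, ui_scale):
    """Given a panel's native resolution and a desired UI scale, return the
    size of the oversized framebuffer macOS-style rendering would use:
    everything is drawn at 2x the apparent ("looks like") resolution,
    then the whole buffer is downscaled to the native panel resolution."""
    apparent = (native_w / ui_scale, native_h / ui_scale)
    return (int(apparent[0] * 2), int(apparent[1] * 2))

# "Looks like 2560x1440" (1.5x scale) on a 4K panel -> 5120x2880 backing
# store, downscaled to 3840x2160 on scan-out.
print(macos_style_backing(3840, 2160, 1.5))

# Default 2x scale ("looks like 1920x1080") needs only a native-sized buffer.
print(macos_style_backing(3840, 2160, 2.0))
```

The first case is the "minimal loss of resolution" scenario from the comment: scaled content is rasterized into more pixels than the panel has, so the final downscale averages detail rather than discarding it.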

mono442 50 minutes ago [-]
That's exactly what he is trying to do. It's really just because macOS has bad text rendering when graphics aren't rendered at 2x scale.
LuxBennu 3 hours ago [-]
Sadly I have the issue on a new m5 air. I have a 60hz 4k work monitor and two high refresh 4k gaming displays. The 60hz pairs fine with either gaming monitor, but the two gaming ones together and one just doesn't get recognized. Spent way too long trying new cables before realizing it's a bandwidth limitation.
wmf 3 hours ago [-]
This is not a normal retina configuration. This is a highly unusual configuration where the framebuffer is much larger than the screen resolution and gets scaled down. Obviously it sucks if it used to work and now it doesn't but almost no one wants this which probably explains why Apple doesn't care.
smcleod 2 hours ago [-]
In my case it's a standard LG UltraFine 4K monitor plugged into a standard 16" M5 MacBook Pro via standard Thunderbolt (over USB-C) - I'm not sure what's not normal about this? I've confirmed it with other monitors and M5 MacBook Pros as well.
petersellers 1 hours ago [-]
In macOS display settings, what scaling mode are you using? This bug appears to only affect 4K monitors that are configured to use the maximum amount of screen space (which makes text look uncomfortably tiny unless you have a very large monitor). Most people run at the default setting which gives you the real estate of a 1080p screen at 2x scale, hence the "not normal" part of this configuration.

Actually, I don't even think it's possible to run HiDPI mode at the native resolution scale from within the macOS settings app, you'd need something like `Better Display` to turn it on explicitly.

smcleod 58 minutes ago [-]
If you use the middle screen scaling you're given absolutely huge UI elements. That's the case for the built-in 16" screen as well as external displays, but once you get up to 32" displays it's almost comical how large the UI is on the middle / default setting.
petersellers 51 minutes ago [-]
Yeah, on larger monitors it's more common to run at the monitor's native resolution without scaling, but even so macOS will not turn on HiDPI mode - you'd still need to do this explicitly via another app. (I didn't even know it was possible to turn on HiDPI mode at native scaling until reading this article.)
phonon 2 hours ago [-]
Isn't that just 2x supersampling? If you want "perfect" antialiasing that's the minimum you need, no?
wmf 2 hours ago [-]
Yes, it is supersampling but historically almost no one runs that way.
sgerenser 2 hours ago [-]
I don’t know why this was downvoted; I agree that this is a highly unusual configuration. Why render to a frame buffer with 2x the pixels in each direction vs. the actual display, only to then scale the whole thing down by 2x in each direction?
mlyle 2 hours ago [-]
Because it's a decent way to get oversampling.
eptcyka 1 hours ago [-]
Because Apple no longer implements subpixel rendering for fonts?
Rohansi 1 hours ago [-]
Supersampling the entire framebuffer is a bad way to anti-alias fonts. Especially since your font rendering is almost certainly doing grayscale anti-aliasing already, which is going to look better than 2x supersampling alone. And supersampling will not do subpixel rendering.
wpm 2 hours ago [-]
This is what us proles on third-party monitors have to do to make text look halfway decent. My LG DualUps (~140ppi if I recall) run at 2x of a scaled resolution to arrive at roughly what would be pixel-doubled 109ppi, which is the only pixel density the UI looks halfway decent at. It renders an 18:16 2304 x something at 2x, scaled down by 2.

It's also why, when you put your Mac into "More Space" resolution on the built-in or first-party displays, it tells you this could hurt performance - because that's exactly what the OS is going to do to give you more space without making text unreadably aliased fuzz: it renders the "apparent" resolution pixel-doubled and scales it down, which provides a modicum of sub-pixel anti-aliasing's effect. Apple removed subpixel antialiasing a while back and this is the norm now.

I have a 4K portable display (stupid high density but still not quite "retina" 218) on a monitor arm I run at, as you suggest, 1080p at 2x. Looks ok but everything is still a bit small. If you have a 4K display and want to use all 4K, you have the crappy choice between making everything look terrible, or wasting GPU cycles and memory on rendering an 8K framebuffer and scaling it down to 4K.

I'm actually dealing with this right now on my TV (1080p which is where I'm writing this comment from). My normal Linux/Windows gaming PC that I have hooked up in my living room is DRAM-free pending an RMA, so I'm on a Mac Mini that won't let me independently scale text size and everything else like Windows and KDE let me do. I have to run it at 1600x900 and even then I have to scale every website I go to to make it readable. Text scaling is frankly fucked on macOS unless you are using the Mac as Tim Cook intended: using the built-in display or one of Apple's overpriced externals, sitting with the display at a "retina appropriate" distance for 218ppi to work.

toxik 1 hours ago [-]
Pedantry: 18:16 is the same as 9:8 since it's a ratio.
NBJack 2 hours ago [-]
To be frank, it's kind of embarrassing if an entry-level Windows laptop with a decent integrated GPU handles this without much effort.

Apple is free to make its own choices on priority, but I'm disappointed when something that's considered the pinnacle of creative platforms sporting one of the most advanced consumer processors available can't handle a slightly different resolution.

mil22 3 hours ago [-]
This would be even more compelling if you included screenshots with magnified detail insets showing the text blur.
smcleod 2 hours ago [-]
Thanks for the feedback, I'll try to take some photos. It's not an easy thing to do accurately without a good camera setup, but I'll reply here after work if I get something set up and added to the post.
pier25 2 hours ago [-]
I use a 4K 32'' Asus ProArt monitor and didn't notice any difference between my M2 Pro and my M4 Pro (on Sequoia). I will admit my eyesight is not the best anymore but I think I would notice given I'm a bit allergic to blurry monitors.

Anyway I will run the diagnostic commands and see what I get.

bsimpson 3 hours ago [-]
Wouldn't HiDPI be 1080p@2x? Is that still available?
tom_ 3 hours ago [-]
Yeah. I don't get it. If you've got a 3840x2160 display, intended for use on macOS as a 1920x1080@2x display, what is the advantage of using a 7680x4320 buffer? Everything is drawn at twice the width and height - and then gets scaled down to half the width and height. Is there actually a good reason to do this?

(I use my M4 Mac with 4K displays, and 5120x2880 (2560x1440@2x) buffers. That sort of thing does work, though if you sit closer than I do then you can see the non-integer scaling. Last time I tried a 3840x2160 buffer (1920x1080@2x), that worked. I am still on macOS Sequoia though.)

kalleboo 3 hours ago [-]
> what is the advantage of using a 7680x4320 buffer? Everything is drawn at twice the width and height - and then gets scaled down to half the width and height. Is there actually a good reason to do this?

Text rendering looks noticeably better rendered at 2x and scaled down. Apple's 1x font antialiasing is not ideal.

Especially in Catalyst/SwiftUI apps that often don't bother to align drawing to round points, Apple's HiDPI downscaling has some magic in it that their regular text rendering doesn't.

halapro 2 hours ago [-]
Feels like a huge power loss just to get slightly better text. You slow rendering down 4x for this
wpm 2 hours ago [-]
Yes but Apple got to drop subpixel anti-aliasing support because this workaround is "good enough" for all of their built-in displays and overpriced mediocre external ones, so we all get to suffer having to render 4x the pixels than we need.
metabagel 1 hours ago [-]
So, how do I actually set this for my Mac, or is this something which each application may or may not do?
TheTon 3 hours ago [-]
Yes 1920x1080@2x absolutely works on M4. I use this mode all day every day.
TheCoreh 3 hours ago [-]
Yeah if I understand it correctly, this is more like 2160p@2x which is... unusual?
TheTon 3 hours ago [-]
Yes, I would actually be surprised to learn that mode is available on any system. I’ve never seen that anywhere, though I only have a M1 Pro and an M4 Pro (and various Intel Macs).

You’re rendering to a framebuffer exactly 2x the size of your display and then scaling it down by exactly half to the physical display? Why not just use a 1x mode then!? The 1.75x limit of framebuffer to physical screen size makes perfect sense. Any more than that and you should just use the 1x mode, it will look better and perform way better!

wpm 2 hours ago [-]
Because 1x mode has no subpixel antialiasing and thus looks absolutely terrible.

I have a 32:9 Ultrawide I would love to use on macOS but the text looks awful on it.

TheTon 2 hours ago [-]
Then complain about that. That would make a much more sensible blog post and discussion. Asking for a crazy workaround to a sane problem isn't a great way to get good results, especially with Apple. Beyond the obvious performance pitfall, this scale up to scale down approach will also destroy the appearance of some controls. There is some UI that aims for 1px lines on hidpi modes that will get lost if you do this. It's hardly a perfect mode.
wpm 56 minutes ago [-]
The crazy workaround only needs to be done because of what Apple did probably around a decade ago - and they probably already heard a bunch of crying about it and didn't care. No one removed subpixel antialiasing on their own; we do this bullshit because Apple forced us to in order to make text look halfway decent.
armadyl 3 hours ago [-]
Yeah, I'm not sure what the point of this article really is - or maybe I'm misunderstanding something? There's no such thing as 4K HiDPI on a 4K monitor. That would be 2160p @ 2x on an 8K monitor. 4K at 100% scaling looks terrible in general across every OS.
compounding_it 2 hours ago [-]
The ideal work/coding resolutions and sizes for macOS that I would suggest if you are going down this rabbit hole.

  - 24 inch 1080p
  - 24 inch 4K (2x scaling)
  - 27 inch 1440p
  - 27 inch 5K (2x scaling)
  - 32 inch 6K (2x scaling)

Other sizes are going to either look bizarre or you’ll have to deal with fractional scaling.

Given that 4K is common in 27/32 inch sizes and those are cheap displays, these kinds of problems are expected. In the past I personally refused to believe that 27 inch 4K is as bad as people say and got one myself, only to regret buying it. Get the correct size and scaling and your life will be peaceful.

I would recommend the same for Linux and Windows too tbh but people who game might be fine with other sizes and resolutions.
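The pixel-density arithmetic behind these size/resolution pairings can be sketched as follows. The ~110 ppi (1x) and ~220 ppi (2x "Retina") targets used here are common rules of thumb, not Apple-documented figures:

```python
import math

# Pixels-per-inch for common monitor size/resolution pairings.
# macOS UI metrics work best near ~110 ppi at 1x or ~220 ppi at 2x scaling.

def ppi(width_px, height_px, diagonal_in):
    """Pixel density: diagonal pixel count divided by diagonal inches."""
    return math.hypot(width_px, height_px) / diagonal_in

for label, w, h, d in [
    ("27in 1440p", 2560, 1440, 27),
    ("27in 4K",    3840, 2160, 27),
    ("27in 5K",    5120, 2880, 27),
    ("32in 6K",    6016, 3384, 32),
]:
    print(f"{label}: {ppi(w, h, d):.0f} ppi")

# 27in 4K lands around ~163 ppi -- awkwardly between the 1x (~110) and
# 2x (~220) sweet spots, which is why it needs fractional scaling.
```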

extr 1 hours ago [-]
If you actually care about this stuff you are going to run something like https://github.com/waydabber/BetterDisplay which easily allows for HiDPI @ 4K resolution, it does not "look bizarre" or "require fractional scaling". This is what the OP is about. I do the same thing, I run native res w/ HiDPI on a 27" 4K screen as my only monitor, works great.
smcleod 33 minutes ago [-]
Unfortunately BetterDisplay cannot set HiDPI @ 4K on the M5 machines - that was the first thing I tried.
danny8000 2 hours ago [-]
A 32" 4K display at fractional scaling of 1.5 (150%) is fine for my day-to-day work (Excel, VS Code, Word, web browsing, Teams, etc.). It delivers sharp enough text at an effective resolution of 2560x1440 px. There are many 32" 4K displays that are affordable and good enough for office workers. I work in a brightly lit room, so I find that monitor brightness (over 350 nits) matters more to me than text sharpness, color accuracy, or refresh rate.
jbellis 2 hours ago [-]
So MacOS supports only a handful of low dpi resolutions and high dpi must be an integer multiple of one of those?
wmf 2 hours ago [-]
It doesn't have to be but it's really designed to run at exactly 2x scale.
stefanfisk 2 hours ago [-]
What makes you say that? Unless I am mistaken, it’s only the Pro models who run at 2x by default.
tern 1 hours ago [-]
Yup, 27" 4k with a Mac is truly awful. Don't do it. Get a 5k display.
extr 1 hours ago [-]
Disagree completely. Works great for me.
mkl 58 minutes ago [-]
27" 4k is totally fine on Windows 11 (not a gamer). Everything is sharp at 150% scale.
smcleod 2 hours ago [-]
For me, 16-27" at 4K is fine, but as you go up to 32" I'd ideally want 5K or 6K, as the difference is quite noticeable for text (even when high-DPI scaling is working, and across operating systems).
shiroiuma 1 hours ago [-]
I have dual 27" monitors, both at work and at home. At work, they're 4K monitors, because that's all they have in this size for some reason (LG if it makes a difference). At home, my own monitors are ASUS ProArt 1440p monitors. I run Linux in both places.

I really like my 1440p monitors at home more than the 4K monitors at work. At work, I'm always dealing with scaling and font size issues, but at home everything looks perfect. So I think you're onto something here: 1440p just seems to be a better resolution on a 27" panel.

tmsh 24 minutes ago [-]
You might need a higher quality usb cable. I run ok with LG 5k display and MacBook Pro m4 max.
keyle 3 hours ago [-]
This is the sort of Apple gotchas that really upset me.

They've got a good thing going, but they keep finding ways to alienate people.

compounding_it 2 hours ago [-]
Their suggestion : get an Apple monitor that we just launched.
whatever1 3 hours ago [-]
How did none of the Apple devs notice this? 32 inch 4K is the industry standard for HiDPI monitors.
MBCook 3 hours ago [-]
Apple doesn’t make an 4k external monitor.

They’re likely all on Studio Displays.

cosmic_cheese 3 hours ago [-]
And prior to Apple’s re-entry into the display market, everybody internally was likely on 2x HiDPI LG UltraFine displays or integrated displays on iMacs and MacBooks.

Fractional scaling (and lately, even 1x scaling “normal”) displays really are not much of a consideration for them, even if they’re popular. 2x+ integer scaling HiDPI is the main target.

jiveturkey 2 hours ago [-]
Not in the Apple world, and this article is centered on Apple.

https://bjango.com/articles/macexternaldisplays/

  - 24" you need 4K.
  - 27" you need 5K.
  - 32" you need 6K.
Windows subpixel anti-aliasing (ClearType) manages a lot better at lower pixel densities. Since Windows still has a commanding market share in enterprise, you might be right about the industry standard for HiDPI, but for Apple-specific usage, not really.
NBJack 2 hours ago [-]
This still baffles me. Never mind Windows; I can get sub-pixel font rendering with the ability to fine-tune it on virtually any major Linux distro since around 2010.

Meanwhile, Apple had this but dropped it in 2018, allegedly under the assumption of "hiDPI everywhere" Retina or Retina-like displays. Which would be great...except "everywhere" turned out to be "very specific monitors support specific resolutions".

smcleod 2 hours ago [-]
Totally agree with those resolution suggestions. Personally I have a 32" 4k, I wanted a 5k or 6k back then (just too expensive) - but now I wish I had just got a 27" which is better suited to 4k - regardless it was a LOT better on the M2 Max with HiDPI working.
robertoandred 3 hours ago [-]
Don’t think I’d call 4K at 32” high dpi.
whatever1 54 minutes ago [-]
I agree with you. It's my IT department that disagrees.

But to be fair, until last year there were no retina monitors on the market except the Apple ones. In 2025 the tide turned; there are now way more options for both 5K and 6K retina displays.

Gigachad 2 hours ago [-]
Tbh I'm not even sure what the issue is here. I have a personal M1 macbook and a work M4 and a 4k display. I don't see any issues or differences between them on my display. The M4 seems to be outputting a 4k image just fine.

The article could just be AI slop since it just contains hyper in depth debugging without articulating what the problem is.

whatever1 2 hours ago [-]
In layman terms, for some UI scaling options, text is rendered blurry by M4/M5 Macs.
Gigachad 2 hours ago [-]
Right, I just went through all of the scale options on my M4 with a 4K monitor and none of them rendered blurry. Might be a very situational bug. It doesn't seem as widespread as the title makes it out to be.
pier25 3 hours ago [-]
Is this for specific verisons of macOS?

The article doesn't mention it.

smcleod 2 hours ago [-]
It's in the environment and test setup (26.4 at the moment, but it is the same across all 26.x releases I've tried).
pier25 2 hours ago [-]
I'm still on Sequoia and haven't noticed these issues on my M4 Pro.
smcleod 30 minutes ago [-]
Out of interest - what's the output of `system_profiler SPDisplaysDataType`?
jval43 1 hours ago [-]
Not again! Had these issues with 2016 Macbook Pro (the touchbar one).

That one also wasn't a hardware limitation as it ran my displays just fine in bootcamp, but macOS would just produce fuzzy output all the way.

It's infuriating.

spoaceman7777 1 hours ago [-]
Yep. Apple sells 5k displays, which work fine.

Just another case of Apple intentionally going against established open standards to price gouge their users.

I wouldn't mind it as much if I didn't have to hear said users constantly moaning in ecstasy about just how much better "Apple's way" is.

High quality desktop Linux has been made real by KDE, and the AI-fueled FOSS development boom is accelerating this eclipse of proprietary nonsense like this.

If you're a developer, you should be using a system that isn't maintained by a company that intentionally stabs developers in the back at every turn. (Unless you're into that. U do u.)

lovegrenoble 2 hours ago [-]
They do this on purpose ...
chaostheory 1 hours ago [-]
What are they doing with MacOS? Is this due to VisionOS?
jiveturkey 2 hours ago [-]
TFA doesn't say -- does anyone know if this applies to 5k and 6k monitors? On my 5k display on a M4 Max, I see the default resolution in system settings is 2560x1440. Which is what I'd expect.

If the theory about framebuffer pre-allocation strategy is to hold any water, I would think that 5k and 6k devices would suffer too, maybe even more. Given that you can attach 2x 5k monitors, the pre-allocation strategy as described would need to account for that.
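A rough sketch of the memory arithmetic behind that theory, assuming 4 bytes per pixel and a single 2x-of-native backing buffer per display (an illustration, not Apple's actual allocator):

```python
# Rough VRAM cost of a full 2x-of-native HiDPI backing buffer per display,
# i.e. the mode the post asks for on 4K, extrapolated to 5K panels.
# Assumes 4 bytes/pixel and one buffer per display; real allocation differs.

def backing_store_mb(native_w, native_h, scale=2.0, bytes_per_px=4):
    w, h = int(native_w * scale), int(native_h * scale)
    return w * h * bytes_per_px / (1024 ** 2)

mb_4k = backing_store_mb(3840, 2160)  # 8K buffer for one 4K panel
mb_5k = backing_store_mb(5120, 2880)  # 10K-wide buffer for one 5K panel

print(f"one 4K @ 2x-of-native: {mb_4k:.0f} MB")
print(f"one 5K @ 2x-of-native: {mb_5k:.0f} MB, two 5Ks: {2 * mb_5k:.0f} MB")
```

If a conservative pre-allocation strategy exists, dual 5K panels at 2x-of-native would indeed demand noticeably more than the 4K case, which is why the question about 5K/6K behavior is a useful probe of the theory.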

smcleod 2 hours ago [-]
I believe it will; you won't get the old level of scaling back until you push up to an 8K display (I could be wrong though, as I don't have a way to test this).
PedroBatista 3 hours ago [-]
Now I know I was not crazy and the "cheap" 4K screen I bought a couple months ago doesn't actually suck.

Tim Apple's Apple has been fu#$%&ing me again..

comex 3 hours ago [-]
Well, it sounds like a real issue, but the diagnosis is AI slop. You can see, for example, how it takes the paragraph quoted from waydabber (attributing the issue to dynamic resource allocation) and expands it into a whole section without really understanding it. The section is in fact self-contradictory: it first claims that the DCP firmware implements framebuffer allocation, then almost immediately goes on to say it's actually the GPU driver and "the DCP itself is not the bottleneck". Similar confusion throughout the rest of the post.
Aurornis 2 hours ago [-]
Agree. I started reading the article until I realized it wasn’t even self-coherent. Then I got to the classic two-column table setup and realized I was just reading straight LLM output.

There might be a problem but it’s hard to know what to trust with these LLM generated reports.

I might be jaded from reading one too many Claude-generated GitHub issues that look exactly like this that turned out to be something else.

smcleod 32 minutes ago [-]
Parts of it were pasted from my Claude Code logs, parts were written by me - and the table, that was me!
xbar 3 hours ago [-]
I think you are probably right--it's a real problem.

As an article, it is not 100% coherent, but there is valid data and a real problem that is clear.

imrozim 1 hours ago [-]
[dead]
7e 2 hours ago [-]
[flagged]
faangguyindia 1 hours ago [-]
what's a codeslave? I couldn't get it from the username.