19 comments

  • nuker 1 hour ago
    Send it to Tim Cook's email. It worked for me for fixing a DisplayPort DSC bug. After Catalina, later macOS releases lost the ability to drive monitors at higher than 60Hz refresh.

    Apple support tortured me with all kinds of diagnostics, only to WontFix it a few weeks later. I wrote the email and it got fixed in Sonoma :)

    https://egpu.io/forums/mac-setup/4k144hz-no-longer-available...

    • smcleod 8 minutes ago
      I don't expect emails to get through to busy CEOs of huge companies like Apple unless you're really lucky and they make it through some automation, but I have dropped him an email just in case. I guess you never know.
    • FireBeyond 50 minutes ago
      No, it didn’t get fully fixed.

      Fucking with DP 1.4 was how they managed to drive the ProDisplay XDR.

      If your monitor could downgrade to DP 1.2, you got better refresh rates than on 1.4 (over 1.4 mine could do 95Hz SDR / 60Hz HDR, but if my monitor claimed it could only do 1.2, that went up to 120/95 on Big Sur and above; under Catalina the same setup could do 144Hz HDR).

      I would be absolutely unsurprised if their fix was to lie to the monitor in negotiation if it was non-Apple and say that the GPU only supported 1.2, and further, I would be also unsurprised to learn that this is related to the current issue.

      • nuker 44 minutes ago
        Ahh, true, I top out at 120Hz now, but it's fine, which is why I said fixed :) I now recall that on Catalina I had full 144Hz and VRR options! Monitor is a Dell G3223Q via CalDigit TS4 DP.
        • FireBeyond 17 minutes ago
          I was using 2 27" LG 27GM950-Bs (IIRC), which could do up to 165Hz and VRR, on a 2019 cheesegrater Mac Pro; it wasn't the cables, or the monitors, or the card.

          People at the time were trying to figure out the math of "How did Apple manage to make 6K HDR work over that bandwidth?" and the answer was simply "by completely fucking the DP 1.4 DSC spec" (it was broken in Big Sur, which was released at the same time). The ProDisplay XDR worked great (for added irony, I ended up with one about a year later), but at the cost of Apple saying "we don't care how much money you've spent on your display hardware if you didn't spend it with us" (which tracks perfectly with, I think, Craig Federighi spending so much time and effort shooting down iMessage on Android and RCS for a long time saying, quote, "It would remove obstacles towards iPhone families being able to give their kids Android phones").

    • nerdsniper 1 hour ago
      Thank you
  • skullone 1 hour ago
    I thought I was going crazy when my new m4 seemed "fuzzier" on my external 4ks. I tried replicating settings from my old MacBook to no avail. I wonder if Apple is doing this on purpose except for their own displays.
    • NBJack 33 minutes ago
      It's a bit nit-picky on my part, but this bizarre world of MacOS resolution/scaling handling vs. other operating systems (including Windows 11 for crying out loud) is one of my biggest gripes with using Apple hardware.

      I remember having to work hard years ago to make my non-Apple display look 'right' on an Intel-based Mac, due to weirdness with scaling and resolutions that a Windows laptop didn't even flinch at. It was a mix of hardware limitations and a lack of options for the resolutions and refresh rates available over a Thunderbolt dock, things I shouldn't have to think about.

      I honestly hope they finally fix this. I would love it if they allowed sub-pixel text rendering options again too.

  • wronglebowski 1 hour ago
    Props to the author for putting in what looks like a ton of work trying to navigate this issue; shame they have to go to these lengths to even have their case considered.
    • MarcelOlsz 1 hour ago
      I went to hell and back trying to get PIP/PBP monitors on my 57" G9 ultrawide to work with my M2 Pro. I ended up having to use a powered HDMI dongle, a DisplayLink cable, and DisplayPort, with 3 virtual monitors via BetterDisplay. Allowing resolutions outside of the Mac's limitations in the BD settings is what did the trick. I don't envy OP. Having 5120x1440 @ LoDPI was the worst: just ever so slightly too fuzzy, but the perfect UI size. Eventually I got a steady 10240x2880 @ 120Hz with HDR. I literally laughed out loud when I read the title of the thread. Poor guy.
    • smcleod 21 minutes ago
      Thanks, it was a good portion of my weekend bashing my head against the keyboard trying to figure out what was going on and if there was a workaround I could use (there isn't that I've found).
  • LuxBennu 1 hour ago
    Sadly I have the issue on a new m5 air. I have a 60hz 4k work monitor and two high refresh 4k gaming displays. The 60hz pairs fine with either gaming monitor, but the two gaming ones together and one just doesn't get recognized. Spent way too long trying new cables before realizing it's a bandwidth limitation.
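    That bandwidth ceiling is easy to sanity-check with a rough calculation. A simplified sketch (raw pixel rate only, ignoring blanking intervals and DSC compression; 25.92 Gbit/s is the usable payload of a 4-lane DP 1.4 HBR3 link after 8b/10b encoding):

```python
def gbps(width, height, hz, bits_per_pixel=30):
    """Approximate uncompressed video data rate in Gbit/s (10-bit RGB)."""
    return width * height * hz * bits_per_pixel / 1e9

DP_HBR3_PAYLOAD = 25.92  # Gbit/s usable on a 4-lane DP 1.4 HBR3 link

one_gaming_4k = gbps(3840, 2160, 144)
print(round(one_gaming_4k, 1))          # 35.8 -- a single 4K144 stream
print(one_gaming_4k > DP_HBR3_PAYLOAD)  # True: it already needs DSC on its own
```

    So two high-refresh 4K streams plus a 60Hz 4K panel are all leaning on DSC and whatever the host can mux, and it's plausible one stream simply stops being recognized once the budget runs out.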
  • arjie 1 hour ago
    I'm sure you've already given this a crack via some other technique (I just Cmd-F'd for it and didn't find it), but I have had monitors with confusing EDIDs before that macOS didn't handle well, and the "screenresolution" CLI app https://github.com/jhford/screenresolution always let me set an arbitrary one. It was the only way to get some monitors to display at 100 Hz for me, and it worked very well for that since the resolution is mostly sticky.
  • wmf 1 hour ago
    This is not a normal retina configuration. This is a highly unusual configuration where the framebuffer is much larger than the screen resolution and gets scaled down. Obviously it sucks if it used to work and now it doesn't but almost no one wants this which probably explains why Apple doesn't care.
    • smcleod 19 minutes ago
      In my case it's a standard LG UltraFine 4K monitor plugged into a standard 16" M5 MacBook Pro via standard Thunderbolt (via USB-C) - not sure what's not normal about this? I've confirmed it with other monitors and M5 MacBook Pros as well.
    • phonon 39 minutes ago
      Isn't that just 2x supersampling? If you want "perfect" antialiasing that's the minimum you need, no?
      • wmf 29 minutes ago
        Yes, it is supersampling but historically almost no one runs that way.
    • NBJack 25 minutes ago
      To be frank, it's kind of embarrassing if an entry-level Windows laptop with a decent integrated GPU handles this without much effort.

      Apple is free to make its own choices on priority, but I'm disappointed when something that's considered the pinnacle of creative platforms sporting one of the most advanced consumer processors available can't handle a slightly different resolution.

    • sgerenser 49 minutes ago
      I don’t know why this was downvoted; I agree that this is a highly unusual configuration. Why render to a frame buffer with 2x the pixels in each direction vs the actual display, only to then scale the whole thing down by 2x in each direction?
      • mlyle 10 minutes ago
        Because it's a decent way to get oversampling.
    • wpm 18 minutes ago
      This is what us proles on third-party monitors have to do to make text look halfway decent. My LG DualUps (~140ppi if I recall) run at 2x of a scaled resolution to arrive at roughly what would be pixel-doubled 109ppi, which is the only pixel density the UI looks halfway decent at. It renders an 18:16 2304 x something at 2x, scaled down by 2.

      It's also why, when you put your Mac into "More Space" resolution on the built-in or first-party displays, it tells you this could hurt performance: that's exactly what the OS is going to do to give you more space without making text unreadable aliased fuzz. It renders the "apparent" resolution pixel-doubled and scales it down, which provides a modicum of sub-pixel anti-aliasing's effect. Apple removed subpixel antialiasing a while back, and this is the norm now.

      I have a 4K portable display (stupid high density but still not quite "retina" 218) on a monitor arm I run at, as you suggest, 1080p at 2x. Looks ok but everything is still a bit small. If you have a 4K display and want to use all 4K, you have the crappy choice between making everything look terrible, or wasting GPU cycles and memory on rendering an 8K framebuffer and scaling it down to 4K.

      I'm actually dealing with this right now on my TV (1080p which is where I'm writing this comment from). My normal Linux/Windows gaming PC that I have hooked up in my living room is DRAM-free pending an RMA, so I'm on a Mac Mini that won't let me independently scale text size and everything else like Windows and KDE let me do. I have to run it at 1600x900 and even then I have to scale every website I go to to make it readable. Text scaling is frankly fucked on macOS unless you are using the Mac as Tim Cook intended: using the built-in display or one of Apple's overpriced externals, sitting with the display at a "retina appropriate" distance for 218ppi to work.
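      To put numbers on the "wasting GPU cycles and memory" trade-off: a back-of-the-envelope sketch (assumes a plain RGBA8 back buffer; `framebuffer_bytes` is a made-up helper for illustration, not a macOS API):

```python
def framebuffer_bytes(width, height, scale=2, bytes_per_pixel=4):
    """Bytes for one RGBA8 buffer rendered at `scale`x the apparent resolution."""
    return width * scale * height * scale * bytes_per_pixel

# "Looks like" 3840x2160 rendered at 2x means an 8K (7680x4320) back buffer:
print(framebuffer_bytes(3840, 2160) // 2**20)           # 126 (MiB)
print(framebuffer_bytes(3840, 2160, scale=1) // 2**20)  # 31 (MiB) at native 1x
```

      Four times the pixels to fill every frame, which is the cost you end up accepting just to get tolerable text.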

  • pier25 44 minutes ago
    I use a 4K 32'' Asus ProArt monitor and didn't notice any difference between my M2 Pro and my M4 Pro. I will admit my eyesight is not the best anymore but I think I would notice given I'm a bit allergic to blurry monitors.

    Anyway I will run the diagnostic commands and see what I get.

  • mil22 1 hour ago
    This would be even more compelling if you included screenshots with magnified detail insets showing the text blur.
    • smcleod 17 minutes ago
      Thanks for the feedback, I'll try to take some photos. It's not an easy thing to do accurately without a good camera setup, but I'll reply here after work if I get something set up and added to the post.
  • compounding_it 39 minutes ago
    These are the ideal work/coding resolutions and sizes for macOS that I would suggest if you are going down this rabbit hole:

      - 24 inch 1080p
      - 24 inch 4k (2x scaling)
      - 27 inch 1440p
      - 27 inch 5k (2x scaling)
      - 32 inch 6k (2x scaling)

    Other sizes are going to either look bizarre or you’ll have to deal with fractional scaling.

    Given that 4k is common in 27/32 inches and those are cheap displays, these kinds of problems are expected. In the past I personally refused to accept that 27 inch 4k is as bad as people say and got one myself, only to regret buying it. Get the correct size and scaling and your life will be peaceful.

    I would recommend the same for Linux and Windows too tbh but people who game might be fine with other sizes and resolutions.
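    The pattern behind that list is pixel density: the 2x entries all land near the ~218ppi that macOS's pixel-doubled UI is designed around. A quick check (diagonals in inches):

```python
from math import hypot

def ppi(w_px, h_px, diag_in):
    """Pixels per inch of a panel given its resolution and diagonal size."""
    return hypot(w_px, h_px) / diag_in

print(round(ppi(5120, 2880, 27)))  # 218: 27" 5k, clean 2x scaling
print(round(ppi(6016, 3384, 32)))  # 216: 32" 6k, clean 2x scaling
print(round(ppi(3840, 2160, 27)))  # 163: 27" 4k, stuck between 1x and 2x
```

    Panels in that middle ~140-180ppi band end up needing fractional scaling, which is where the "bizarre or fuzzy" outcomes come from.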

    • danny8000 30 minutes ago
      A 32" 4k display at fractional scaling of 1.5 (150%) is fine for my day-to-day work (Excel, VS Code, Word, web browsing, Teams etc.). It delivers sharp enough text at an effective resolution of 2560x1440 px. There are many 32" 4k displays that are affordable and good enough for office workers. I work in a brightly lit room, so I find that monitor brightness (over 350 nits) is the most important monitor feature for me, over text sharpness, color accuracy, or refresh rate.
    • smcleod 22 minutes ago
      For me, 16-27" at 4k is fine, but as you go up to 32" I'd want 5k or 6k ideally, as it's quite noticeable for text (even when high-DPI scaling is working, and across operating systems).
    • jbellis 34 minutes ago
      So MacOS supports only a handful of low dpi resolutions and high dpi must be an integer multiple of one of those?
      • wmf 32 minutes ago
        It doesn't have to be but it's really designed to run at exactly 2x scale.
        • stefanfisk 23 minutes ago
          What makes you say that? Unless I am mistaken, it’s only the Pro models that run at 2x by default.
  • bsimpson 1 hour ago
    Wouldn't HiDPI be 1080p@2x? Is that still available?
    • tom_ 1 hour ago
      Yeah. I don't get it. If you've got a 3840x2160 display, intended use on macOS as a 1920x1080@2x display, what is the advantage of using a 7680x4320 buffer? Everything is drawn at twice the width and height - and then gets scaled down to half the width and height. Is there actually a good reason to do this?

      (I use my M4 Mac with 4K displays, and 5120x2880 (2560x1440@2x) buffers. That sort of thing does work, though if you sit closer than I do then you can see the non-integer scaling. Last time I tried a 3840x2160 buffer (1920x1080@2x), that worked. I am still on macOS Sequoia though.)

      • kalleboo 1 hour ago
        > what is the advantage of using a 7680x4320 buffer? Everything is drawn at twice the width and height - and then gets scaled down to half the width and height. Is there actually a good reason to do this?

        Text rendering looks noticeably better rendered at 2x and scaled down. Apple's 1x font antialiasing is not ideal.

        Especially in Catalyst/SwiftUI apps that often don't bother to align drawing to round points, Apple's HiDPI downscaling has some magic in it that their regular text rendering doesn't.
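        The downscaling "magic" is, at its core, supersampling: every 1x pixel averages a 2x2 block of the 2x render, so hard glyph edges pick up intermediate gray levels. A toy sketch of the idea (a plain box filter; Apple's actual scaler is unspecified):

```python
def downscale_2x(img):
    """Average each 2x2 block of a 2H x 2W grayscale image into one 1x pixel."""
    h, w = len(img) // 2, len(img[0]) // 2
    return [
        [
            (img[2*y][2*x] + img[2*y][2*x+1] + img[2*y+1][2*x] + img[2*y+1][2*x+1]) / 4
            for x in range(w)
        ]
        for y in range(h)
    ]

# A hard black/white diagonal edge rendered at 2x...
hi = [
    [0, 0, 0, 255],
    [0, 0, 255, 255],
    [0, 255, 255, 255],
    [255, 255, 255, 255],
]
# ...becomes a softened edge with intermediate gray levels at 1x:
print(downscale_2x(hi))  # [[0.0, 191.25], [191.25, 255.0]]
```

        Those 191.25 values are the smoothed edge; plain 1x rendering has to approximate that effect with its own (weaker) antialiasing.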

        • halapro 46 minutes ago
          Feels like a huge waste of power just to get slightly better text. You slow rendering down 4x for this.
          • wpm 27 minutes ago
            Yes but Apple got to drop subpixel anti-aliasing support because this workaround is "good enough" for all of their built-in displays and overpriced mediocre external ones, so we all get to suffer having to render 4x the pixels than we need.
      • TheTon 1 hour ago
        Yes 1920x1080@2x absolutely works on M4. I use this mode all day every day.
    • TheCoreh 1 hour ago
      Yeah if I understand it correctly, this is more like 2160p@2x which is... unusual?
      • TheTon 1 hour ago
        Yes, I would actually be surprised to learn that mode is available on any system. I’ve never seen that anywhere, though I only have a M1 Pro and an M4 Pro (and various Intel Macs).

        You’re rendering to a framebuffer exactly 2x the size of your display and then scaling it down by exactly half to the physical display? Why not just use a 1x mode then!? The 1.75x limit of framebuffer to physical screen size makes perfect sense. Any more than that and you should just use the 1x mode, it will look better and perform way better!

        • wpm 26 minutes ago
          Because 1x mode has no subpixel antialiasing and thus looks absolutely terrible.

          I have a 32:9 Ultrawide I would love to use on macOS but the text looks awful on it.

    • armadyl 1 hour ago
      Yeah, I'm not sure what the point of this article is, really, or maybe I'm misunderstanding something? There's no such thing as 4K HiDPI on a 4K monitor. That would be 2160p @ 2x on an 8K monitor. 4K at 100% scaling looks terrible in general across every OS.
  • keyle 57 minutes ago
    This is the sort of Apple gotchas that really upset me.

    They've got a good thing going, but they keep finding ways to alienate people.

    • compounding_it 47 minutes ago
      Their suggestion: get an Apple monitor that we just launched.
  • comex 1 hour ago
    Well, it sounds like a real issue, but the diagnosis is AI slop. You can see, for example, how it takes the paragraph quoted from waydabber (attributing the issue to dynamic resource allocation) and expands it into a whole section without really understanding it. The section is in fact self-contradictory: it first claims that the DCP firmware implements framebuffer allocation, then almost immediately goes on to say it's actually the GPU driver and "the DCP itself is not the bottleneck". Similar confusion throughout the rest of the post.
    • Aurornis 7 minutes ago
      Agree. I started reading the article until I realized it wasn’t even self-coherent. Then I got to the classic two-column table setup and realized I was just reading straight LLM output.

      There might be a problem but it’s hard to know what to trust with these LLM generated reports.

      I might be jaded from reading one too many Claude-generated GitHub issues that look exactly like this that turned out to be something else.

    • xbar 1 hour ago
      I think you are probably right--it's a real problem.

      As an article, it is not 100% coherent, but there is valid data and a real problem, and that much is clear.

  • lovegrenoble 15 minutes ago
    They do this on purpose ...
  • pier25 1 hour ago
    Is this for specific versions of macOS?

    The article doesn't mention it.

    • smcleod 17 minutes ago
      It's in the environment and test setup (26.4 at the moment, but it is the same across all 26.x releases I've tried).
  • whatever1 1 hour ago
    How did none of the Apple devs notice this? 32 inch 4k is the industry standard for HiDPI monitors.
    • MBCook 1 hour ago
      Apple doesn’t make a 4k external monitor.

      They’re likely all on Studio Displays.

      • cosmic_cheese 58 minutes ago
        And prior to Apple’s re-entry into the display market, everybody internally was likely on 2x HiDPI LG UltraFine displays or integrated displays on iMacs and MacBooks.

        Fractional scaling (and lately, even 1x scaling “normal”) displays really are not much of a consideration for them, even if they’re popular. 2x+ integer scaling HiDPI is the main target.

    • jiveturkey 34 minutes ago
      Not in the Apple world, and this article is centered on Apple.

      https://bjango.com/articles/macexternaldisplays/

        - 24" you need 4k
        - 27" you need 5K.
        - 32" you need 6k.
      
      Windows subpixel anti-aliasing (ClearType) manages a lot better with lower pixel density. Since Windows still has a commanding market share in enterprise, you might be right about the industry standard for HiDPI, but for Apple-specific usage, not really.
      • smcleod 13 minutes ago
        Totally agree with those resolution suggestions. Personally I have a 32" 4k; I wanted a 5k or 6k at the time (just too expensive), but now I wish I had just got a 27", which is better suited to 4k. Regardless, it was a LOT better on the M2 Max with HiDPI working.
      • NBJack 19 minutes ago
        This still baffles me. Never mind Windows; I can get sub-pixel font rendering with the ability to fine-tune it on virtually any major Linux distro since around 2010.

        Meanwhile, Apple had this but dropped it in 2018, allegedly under the assumption of "hiDPI everywhere" Retina or Retina-like displays. Which would be great...except "everywhere" turned out to be "very specific monitors support specific resolutions".

    • robertoandred 56 minutes ago
      Don’t think I’d call 4K at 32” high dpi.
    • Gigachad 48 minutes ago
      Tbh I'm not even sure what the issue is here. I have a personal M1 MacBook, a work M4, and a 4k display. I don't see any issues or differences between them on my display. The M4 seems to be outputting a 4k image just fine.

      The article could just be AI slop since it just contains hyper in depth debugging without articulating what the problem is.

      • whatever1 32 minutes ago
        In layman terms, for some UI scaling options, text is rendered blurry by M4/M5 Macs.
        • Gigachad 12 minutes ago
          Right, I just went through all of the scale options on my M4 with a 4k monitor and none of them rendered blurry. Might be a very situational bug. Doesn't seem as widespread as the title makes it out to be.
  • jiveturkey 20 minutes ago
    TFA doesn't say -- does anyone know if this applies to 5k and 6k monitors? On my 5k display on a M4 Max, I see the default resolution in system settings is 2560x1440.

    If the theory about a framebuffer pre-allocation strategy holds any water, I would think that 5k and 6k devices would suffer too, maybe even more. Given that you can attach 2x 5k monitors, the pre-allocation strategy as described would need to account for that.

    • smcleod 18 minutes ago
      I believe it will; you won't get the old level of scaling back until you push up to an 8k display (could be wrong though, as I don't have a way to test this).
  • PedroBatista 1 hour ago
    Now I know I was not crazy and the "cheap" 4K screen I bought a couple months ago doesn't actually suck.

    Tim Apple's Apple has been fu#$%& me again..

  • 7e 39 minutes ago
    Apple software is written by codeslaves under constant fear of deportation. They’re cheap and they can’t do software worth a damn.