Hacker News | TheTon's comments

Yes 1920x1080@2x absolutely works on M4. I use this mode all day every day.

Yeah that would work, that's just 2k HiDPI, not 4k HiDPI.

Yes, I would actually be surprised to learn that mode is available on any system. I’ve never seen that anywhere, though I only have an M1 Pro and an M4 Pro (and various Intel Macs).

You’re rendering to a framebuffer exactly 2x the size of your display and then scaling it down by exactly half to the physical display? Why not just use a 1x mode then!? The 1.75x limit of framebuffer to physical screen size makes perfect sense. Any more than that and you should just use the 1x mode, it will look better and perform way better!
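To make the arithmetic concrete, here's a small sketch of the "looks like" HiDPI math being discussed. Note the 1.75x figure is the cap cited in this thread, not a documented Apple constant, and the panel widths are illustrative:

```python
# Backing-framebuffer math for macOS "looks like" scaled modes.
# hidpi = 2 means each logical point is rendered as a 2x2 pixel block.
def scaled_mode(looks_like_w, panel_w, hidpi=2):
    """Return (framebuffer width, framebuffer-to-panel ratio)."""
    fb_w = looks_like_w * hidpi
    return fb_w, fb_w / panel_w

# "Looks like" 1920x1080 at 2x on a 4K panel: framebuffer == panel, no downscale.
print(scaled_mode(1920, 3840))  # (3840, 1.0)

# "Looks like" 3840x2160 at 2x on a 4K panel: framebuffer is exactly 2x the
# panel width, over the 1.75x cap reported in this thread.
print(scaled_mode(3840, 3840))  # (7680, 2.0)
```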


Because 1x mode has no subpixel antialiasing and thus looks absolutely terrible.

I have a 32:9 Ultrawide I would love to use on macOS but the text looks awful on it.


Then complain about that. That would make a much more sensible blog post and discussion. Asking for a crazy workaround to a sane problem isn't a great way to get good results, especially with Apple. Beyond the obvious performance pitfall, this scale up to scale down approach will also destroy the appearance of some controls. There is some UI that aims for 1px lines on hidpi modes that will get lost if you do this. It's hardly a perfect mode.

Scaling up before scaling down is a sensible approach, especially when you want to run at a slightly lower scaled resolution. It worked fine up to M3 Pro. So whatever you think of it, it's something that worked fine for many years and suddenly doesn't anymore on the newer MacBooks.

The crazy workaround only needs to be done because of what Apple did probably around a decade ago, about which they probably already heard a bunch of crying and didn't care. No one removed subpixel antialiasing on their own; we do this bullshit because Apple forced us to, just to make text look halfway decent.

I can tell you that inside Apple, they have something called the standard question, and it goes something like this: “What are you really trying to do?”

If you haven’t personally filed a bug report at feedbackassistant.apple.com, I recommend that you do so. Title it something like “Poor text quality on LoDPI display”, file it in the Displays component, and in the description explain what you’re seeing. Here’s the critical part: you want to attach images showing what looks bad and what looks better, and why the current behavior is a regression and since when (earlier macOS versions for subpixel AA, earlier GPUs for 2x 1x mode). If possible, use the same display, but get an image of historical macOS when it had subpixel AA, macOS with this 2x 1x mode, Windows 11, and then current macOS at the standard 1x mode. I’m not sure screenshots will capture it, you’ll probably need to use a camera.

I know how they think at Apple. If you come at them with a bug written like OP’s blog, they are going to say it behaves as designed. To get them to fix something, you have to be descriptive about what the real problem actually is: the text rendering looks bad. Then you have to explain what used to work and what you’ve tried and bring receipts (the images). Don’t write a novel; write the shortest bug that fully describes the real problem, includes all of the relevant information including macOS versions, hardware info, and display model, and the evidence of the problem, but don’t include a bunch of emotional text or extraneous information (like SkyLight framework reverse engineering stuff).

Now you might say, “I’m not Apple’s free QA”, and you’ll be right. But consider that you’re spending this time complaining about a problem online, and you’ve spent good money on a display you’d like to use that isn’t working the way you want. Fair or not, you care about the outcome, and at this point you might as well take my advice and file a strong bug to make your case. Dupes help, so OP should file one too, but be descriptive about the real problem, not prescriptive about bringing back the crazy workaround that they likely intentionally disabled because, on the face of it, it makes no sense.

I do know that they read user bugs in the Displays component, because I have filed a few in there recently and they got fixed and they followed up with me about where they were fixed.


Without it, running 2160p@1x results in very low-quality text rendering on macOS.

Exactly. Their holy war on the App Stores blunted Fortnite’s momentum at its apogee.

On one hand, I admire their chutzpah. The App Store model has weighed down the entire software industry and has prevented entire categories of new products from growing out of infancy due to anticompetitive practices. Everyone, Apple and Google included, would actually be better off without the App Stores in their present form, and I’d love to see them weakened or eliminated.

But on the other hand, Epic actually accomplished very little in their war, and nowhere near what being unavailable on mobile platforms for years cost them.

Additionally, their refusal to go after Xbox, PlayStation, and Switch never made any sense to anyone except for those with a financial interest in those arrangements. The rest of us were just confused — the console App Stores are the exact same model as the mobile App Stores.

I suspect Epic’s actual reason for not going after the consoles was a bit of realpolitik or cowardice, depending on how you look at it. They couldn’t afford to be locked out of the mobile and console stores at the same time, so they invented some tortured rationale for why they could pay the console vendors their 30% but not the mobile vendors. But this muddied their message, they came up mostly empty-handed in the end, and here we are today.


On the other hand, the underlying functionality for sandboxing is used heavily throughout the OS, both for App Sandboxes and for Apple’s own system processes. My guess is sandbox-exec is deprecated more because it never was adequately documented rather than because it’s flawed in some way.
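For anyone unfamiliar with it, sandbox-exec takes a profile written in SBPL, a Scheme-like policy language. A minimal deny-by-default sketch (the paths are illustrative, and the exact set of allowed operation names varies across macOS versions):

```scheme
;; example.sb: deny everything by default, then open small holes.
(version 1)
(deny default)
(allow process-exec (literal "/bin/ls"))   ; let the target binary run
(allow file-read* (subpath "/usr/lib"))    ; shared libraries it loads
```

You would run it as `sandbox-exec -f example.sb /bin/ls /usr/lib`; denied operations typically show up in the system log, which helps when iterating on a profile.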


> the underlying functionality for sandboxing is used heavily throughout the OS, both for App Sandboxes and for Apple’s own system processes.

Security researchers will leverage every part of the OS stack to bypass the sandbox in XNU, which they have done multiple times.

Now, there is a good reason for them to break the sandbox, thanks to the hype around 'agents'. It could take even a single file to break it. [0]

> My guess is sandbox-exec is deprecated more because it never was adequately documented rather than because it’s flawed in some way.

You do not know that. I am saying that it has been bypassed before, and having it used all over the OS doesn't mean anything. If anything, that makes it worse.

[0] https://the-sequence.com/crashone-cve-2025-24277-macos-sandb...


You could apply this same reasoning to any feature or technology. Yes, there could be a zero-day nobody knows about. We could say that about ssh or WebKit or Chrome too.

I hear what you're saying about the deprecation status, but as I and others mentioned, the fact that the underlying functionality is heavily used throughout the OS by non-deprecated features puts it on more solid footing than a technology that's an island unto itself.


I admit I started reading with some skepticism. It didn't read like PR, so I assumed I was reading fanfic. By the midpoint, she managed to convince me otherwise.

I think the author is walking a tightrope between convincing the reader that she wrote this herself and that there's more depth to her than what we see on stage or in pop media. Writing this blog is definitely a tougher assignment than doing podcast interviews or behind the scenes videos.

You are right, of course, that a good editor could make this better, but I think she's deliberately avoiding that here. A pop star is unwise to fire a good producer without a better replacement, but sometimes they have to bring out the piano and do an acoustic performance live.


There isn’t one single way to be a dedicated gamer.

Inevitably everyone has finite time and access to games and has to make choices about what to play.

As a Mac guy, I always found the game platform wars weird because even on the weakest gaming platform there are still more good games than anyone can individually play. And even on Windows, probably the strongest gaming platform, you’re still missing out on many significant games.

I totally understand buying a system because it has some game that you absolutely must play. I bought an OG Xbox back in the day because I thought I desperately needed to play Deus Ex: Invisible War when it didn’t come to Mac. Got burned on that one, but at least I had Halo before it came to Mac (and was in the end much better there than on Xbox due to expanded online multiplayer).

What I actually don’t get is folks who have to play the hot game of the week every week. Just seems expensive in terms of money, time, and space for different systems, and you only scratch the surface of the games.


This is a big miss for me. I can’t use my TV’s 120Hz VRR mode without HDMI 2.1.

I realize the Xbox Series X is beleaguered at this point, but apart from playing games that are on Steam but not Xbox, I can’t see why I would prefer the Steam Machine.


After commenting, I looked up the actual capabilities of the port, and it turns out that while the port is officially only HDMI 2.0, it still supports 120Hz, HDR, and VRR anyway. So basically it only lacks Display Stream Compression for 144Hz and beyond.

I quickly tested this by connecting my PC running Linux with an RX 6800 to my TV (LG C4). 120Hz, VRR, and HDR were all available.


At 4K? Or are you limited to a lower resolution due to bandwidth constraints?


Yes, 4K120Hz! My TV could do 144Hz, but I couldn't select it, so 4K120 seems to be the limit.


Try for yourself. I get 4k120Hz when connecting my laptop directly via HDMI.


Yeah I have tried it for myself. I am limited to 4K60 when using the HDMI 2.0 port on either my M1 Mac mini or M1 Pro MacBook Pro and LG B2 TV. I do get 4K120 with VRR with newer Macs with HDMI 2.1 as well as my Xbox Series X. It has been my understanding that 4K120 with HDR and VRR requires HDMI 2.1, which is why those HDMI 2.0 limited systems don’t work. Not having a Steam Machine myself, I would assume its HDMI 2.0 port would be similarly limited.

Edit: I should add, I do get 4K120 VRR and HDR on the M1 Macs when connected to a monitor via Thunderbolt or Thunderbolt to DisplayPort adapter, and I would expect a Steam Machine to be similar using DisplayPort, but my TV only has HDMI input and so can’t work in this mode (and a Thunderbolt to HDMI adapter doesn’t work either).


I get 4K120Hz when connecting the output from an HDMI 2.0 port on my laptop to the HDMI 2.1 port on my TV (Sony TV, it has 4 HDMI ports, but for some reason only ports 3 and 4 support HDMI 2.1, 120Hz, VRR).
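For what it's worth, the bandwidth arithmetic is consistent with these reports: 4K120 fits in an HDMI 2.0 link only with 4:2:0 chroma subsampling. A rough sketch, assuming the common CTA-861 4K timing (4400x2250 total pixels including blanking) and the 8b/10b TMDS encoding HDMI 2.0 uses; HDMI 2.1 switches to FRL signaling with different overhead, so this math doesn't apply there:

```python
# Back-of-the-envelope HDMI bandwidth check.
TMDS_OVERHEAD = 10 / 8        # 8b/10b encoding on HDMI 2.0 and earlier
HDMI_2_0_GBPS = 18.0          # HDMI 2.0 maximum TMDS rate

def tmds_gbps(h_total, v_total, refresh_hz, bits_per_px):
    """TMDS bandwidth in Gbit/s for a given timing and pixel depth."""
    pixel_clock = h_total * v_total * refresh_hz
    return pixel_clock * bits_per_px * TMDS_OVERHEAD / 1e9

# 4K120, 8-bit RGB/4:4:4 (24 bits per pixel): way over the 18 Gbps limit.
full = tmds_gbps(4400, 2250, 120, 24)        # ~35.6 Gbps
# 4K120, 8-bit 4:2:0 (chroma subsampled, effectively 12 bits per pixel):
# just squeaks in under 18 Gbps, which is why the port can do it.
subsampled = tmds_gbps(4400, 2250, 120, 12)  # ~17.8 Gbps

print(f"4:4:4 -> {full:.1f} Gbps, 4:2:0 -> {subsampled:.1f} Gbps")
```

So "HDMI 2.0 but it does 4K120 anyway" usually means the link has silently dropped to 4:2:0, which is mostly invisible for games and video but noticeable on desktop text.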


As a kid, I was marginally decent at competitive math. Not good like you think of the kids who dominate those types of competitions at a high level, but "could qualify for the state competition" good.

What I was actually good, or at least fast at, was TI-Basic, which was allowed in a lot of cases (though not all). Usually the problems were set up so you couldn’t find the solution using just the calculator, but if you had a couple of ideas and needed to choose between them you could sometimes cross off the wrong ones with a program.

The script the author gives isn’t a proof itself, unless the proposition is false, in which case a counterexample always makes a great proof :p


I used to do the same thing. I'd scan the test for problems amenable to computational approaches and either pull up one of my custom-made programs or write one on the spot and let it churn in the background for a bit while I worked on other stuff without the calculator.
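For flavor, here's that trick sketched in Python rather than TI-Basic. The claim being tested is a classic, hypothetical contest-style proposition, not one from either commenter's actual tests: "n^2 + n + 41 is prime for every nonnegative integer n."

```python
# Brute-force a small search space to rule out a conjecture, the way a
# calculator program would churn through candidates during a test.
def is_prime(n):
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

# Find the first n where Euler's polynomial n^2 + n + 41 is composite.
counterexample = next(n for n in range(100) if not is_prime(n * n + n + 41))
print(counterexample)  # 40, since 40^2 + 40 + 41 = 1681 = 41^2
```

As the comment above notes, a search like this proves nothing when it comes up empty, but a single hit is a complete disproof.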


But games are full-fledged GUI apps. At a minimum they have a window.

It’s really unclear what it means to support old games but not old apps in general.

I would think the set of APIs used by the set of all existing Intel Mac games probably comes close to everything. Certainly nearly all of AppKit, OpenGL, and Metal 1 and 2, but also media stuff (audio, video), networking stuff, input stuff (IOHID etc).

So then why say only games when the minimum to support the games probably covers a lot of non games too?

I wonder if their plan is to artificially limit who can use the Intel slices of the system frameworks? Like hardcode a list of blessed and tested games? Or (horror) maybe their plan is to only support Rosetta for games that use Win32 — so they’re actually going to be closing the door on old native Mac games and only supporting Wine / Game Porting Toolkit?


Games use a very small portion of the native frameworks. Most would be covered by Foundation, which they have to keep working for Swift anyway (Foundation is being rewritten in Swift), plus just enough to present a window and handle inputs. D3DMetal and the other translation layers remove the need to keep Metal around.

That’s a much smaller target of things to keep running on Intel than the whole shebang that they need to right now to support Rosetta.


I don’t agree. My point is their collective footprint in terms of the macOS API surface (at least as of 2019 or so) is pretty big. I’m not just speculating here, I work in this area so I have a pretty good idea of what is used.


Could you give examples at least of what you think that big collective footprint might include?

Bear in mind that a large chunk of Mac gaming right now that needs translation is Windows games translated via CrossOver.


As I said in my first comment, it's at least Cocoa (Foundation + AppKit), AVFoundation, Metal, OpenGL, and then all of the lower level frameworks and libraries those depend on (which may or may not be used directly by individual games). If you want a concrete example from something open source, go look at what SDL depends on, it's everything I listed and then some. It's also not uncommon for games to have launchers or startup windows that contain additional native UI, so assume you really do need all of AppKit, you couldn't get away with cutting out something like NSTableView or whatever.

So my point remains, if Apple has to continue providing Intel builds of all of these frameworks, that means a lot of other apps could also continue to run. But ... Apple says they won't, so how are they going to accomplish this? That's the mystery to me.
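Incidentally, anyone curious can check whether a given framework or app binary still ships an Intel slice: `lipo -archs` reports it, or you can read the universal (fat) header directly. A minimal sketch in Python; the two-slice header below is fabricated for demonstration, but real universal binaries begin with the same big-endian FAT_MAGIC layout:

```python
import struct

FAT_MAGIC = 0xCAFEBABE  # big-endian universal binary magic
CPU_NAMES = {0x01000007: "x86_64", 0x0100000C: "arm64"}

def fat_slices(data: bytes):
    """Return the CPU architectures listed in a fat Mach-O header."""
    magic, nfat = struct.unpack_from(">II", data, 0)
    if magic != FAT_MAGIC:
        raise ValueError("not a big-endian fat binary")
    archs = []
    for i in range(nfat):
        # Each fat_arch entry is 20 bytes; cputype is its first field.
        cputype, = struct.unpack_from(">I", data, 8 + i * 20)
        archs.append(CPU_NAMES.get(cputype, hex(cputype)))
    return archs

# Synthetic header: magic + count, then two fat_arch entries
# (cputype, cpusubtype, offset, size, align) for x86_64 and arm64.
header = struct.pack(">II", FAT_MAGIC, 2)
header += struct.pack(">IIIII", 0x01000007, 3, 0x1000, 0x100, 12)
header += struct.pack(">IIIII", 0x0100000C, 0, 0x2000, 0x100, 14)
print(fat_slices(header))  # ['x86_64', 'arm64']
```

If the Intel slices quietly disappear from most frameworks in a future macOS, a check like this would show exactly which ones survive.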


With the exception of AVFoundation, I’d covered all of those in my comments. That’s not a lot of surface area. Games are typically not using a significant portion of AppKit beyond what I already mentioned, and AVFoundation is likely also a very thin wrapper that is maintainable.

I’m assuming Apple isn’t going to arbitrarily restrict what runs but will remove things to just the subset that they believe are needed for games such that other stuff just implicitly won’t work.


Is it practical for Apple to produce a set of frameworks for Intel that run some useful set of old games but that do not run any useful set of non game software?

I grant it’s probably possible to do, but I think that is a lot more work and more error prone than just continuing to ship the major frameworks as they were.

From Apple’s perspective I’m sure they have a few big goals here:

1. Encourage anyone who wants to continue offering software on Mac to update their builds to include arm64.

2. Reduce download size, on disk size, and memory use of macOS.

3. Reduce QA burden of testing ancient 3rd party software.

These are also the same motivations Apple had when they eliminated 32 bit Intel and when they eliminated Rosetta 1, but they were criticized especially for leaving behind game libraries.

Arguably, arbitrarily restricting what runs gets them the biggest slice of their goals with the minimum work. Devs are given the stick. People typically only play one game at a time and then quit it, so there isn’t a bunch of Intel code in RAM all the time because of a few small apps hanging out, and Apple has less to test because it’s a finite set of games. It will just chafe, because if they do that, you know some unblessed software could run, but Apple is preventing it to make their own lives easier.


> Is it practical for Apple to produce a set of frameworks for Intel that run some useful set of old games but that do not run any useful set of non game software?

They already have the frameworks supporting Intel. They can just start pruning away.

Some teams will draw the short straw of what needs to continue being supported, but it’s likely a very small subset of what they already maintain today.


If you'd like to see an interesting parallel, go look at how Microsoft announced supporting DirectX 12 on Windows 7 for a blessed list of apps, basically because Blizzard whined hard enough and was a big enough gorilla to demand it.


That's one implementation, yeah, just have a list somewhere of approved software and make an artificial limitation. But their announcement is so vague, it's hard to say.

And then the next question is: why? It's not like they've ever promised much compatibility for old software on new macOS. Why not just let it be best effort: if it runs, it runs?


Ditto. Happily using 7, but if they ever break it I’m switching to Apple Passwords.


I did that last year and haven’t looked back. And I can share passwords with my family without paying an arm and a leg.

Pity. I used 1P for many, many years and recommended it to everyone I knew. I feel like it’s completely lost the plot, though.

