On: https://blog.aktsbot.in/img/stem-darkening-on.png
Personally, I never had many issues with fonts on any OS, except when I connected my M1 MacBook to a 1080p monitor; then it felt like the fonts had no anti-aliasing at all.
I sometimes switch to a bitmap font like Fixedsys Excelsior or GNU Unifont when using MacOS with a low-resolution monitor to compensate (with antialiasing off so the bitmap font looks crisp).
Also, JetBrains Mono somehow looks good on low-res screens even though it’s not a bitmap font; it seems not to blur as much as other fonts when it gets antialiased.
This will then mean making the subpixel anti-aliasing algorithm aware of different subpixel layouts. And this ought to be done anyway, because most anti-aliasing is usually at least somewhat hardware-aware. In my opinion, regardless of how subpixels are laid out, more resolution is always better.
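On Linux, the usual way to tell the rasterizer about a non-standard layout is fontconfig’s `rgba` property. A minimal per-user sketch, assuming (purely for illustration) a BGR panel and the conventional ~/.config/fontconfig/fonts.conf location:

```xml
<?xml version="1.0"?>
<!DOCTYPE fontconfig SYSTEM "fonts.dtd">
<fontconfig>
  <!-- Declare the panel's subpixel order so the subpixel AA filter
       matches the hardware. Valid constants include rgb, bgr, vrgb,
       vbgr and none; "bgr" here is just an example value. -->
  <match target="font">
    <edit name="rgba" mode="assign"><const>bgr</const></edit>
  </match>
</fontconfig>
```

Vertical layouts (vrgb/vbgr) are what you’d reach for on a rotated monitor, which is exactly the “hardware-aware” case described above.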
Today it seems it's hidden as a dconf option:
$ dconf read /org/gnome/desktop/interface/font-antialiasing
'rgba'
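For completeness, the same key is writable, and a sibling key in the org.gnome.desktop.interface schema controls the subpixel order; a sketch (the values shown are just examples):

```shell
# Valid values: 'none', 'grayscale', 'rgba'
dconf write /org/gnome/desktop/interface/font-antialiasing "'rgba'"
# Subpixel order used in 'rgba' mode: 'rgb', 'bgr', 'vrgb' or 'vbgr'
dconf write /org/gnome/desktop/interface/font-rgba-order "'rgb'"
```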
But this is an issue that applies to VA panels as well (cheaper than IPS, worse viewing angles, but better contrast ratio), and I have a 27" 4K VA screen that works just fine with subpixel rendering turned on in Linux: text is so much clearer with it on than off. Attaching a MacBook to a 27" or 32" 4K IPS screen makes me hate MacOS for killing subpixel rendering off.

As for "retina" resolutions, I tried 24" at 4K as soon as it came out (with that Dell monitor that required two DP 1.1 connections for 60Hz, IIRC), and turning subpixel rendering off made text and lines jagged. That was ~190 ppi at a normal viewing distance, with vision corrected to better than 20/20 (which is what I usually have; I can't really work without glasses anyway, and weaker correction leaves me with headaches). For the record, 5K at 27" and 6K at 32" are both roughly ~216 ppi, so not much better than ~190 ppi: subpixel rendering probably gives something like a 2x improvement in text clarity for those not sensitive to colour fringing (I am not).
So, subpixel rendering is really not an issue on any display, but Apple will happily tell you what the limit of your vision is and upsell you on their monitors.
I haven't tried Apple's big "retina" screens, but considering they are ~215 ppi, I'm pretty confident a ~10% increase in PPI wouldn't make the difference that subpixel rendering does. Laptop screens have higher resolution, but I haven't really paid attention to whether the M1 Air 13" or the 4K 14" X1 Carbon works for me without subpixel rendering (I prefer to be docked).
Before anyone jumps on "you've got incredible vision": I wear either glasses or contacts, and with that my vision corrects to better than 20/20 — slightly lower correction induces headaches for me. Without glasses, I'd probably be happy with 640x480 on 32" so they are kind of a must. :)
My apologies for buying 1080p monitors that had no issues with either my Linux or my Windows computers, I guess. I can understand that they might not care about what I care about (supporting the hardware I have, rather than going out of my way to buy a new monitor just because a new computer decided not to work well with it), and I'd argue that maybe that's even fine because it's their device and ecosystem, but jeez, that tone is super uncalled for.
As an aside, I use the M1 MacBook at a scaled resolution of 1440x900 because anything finer is hard for me to see. That's an effective PPI of around 130 on the 13.3-inch screen. A 1080p monitor with a 21.5-inch diagonal has a physical PPI of around 100, so about 80% of that pixel density. That's not to say the MacBook's panel isn't much nicer, but rather that with software anti-aliasing the monitor could definitely be okay. Somehow I don't want to buy a new monitor just for the weekends when I visit the countryside.
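The arithmetic can be checked with a one-liner (pixel pitch from the diagonal pixel count; the figures are the ones quoted in this comment):

```shell
# ppi = sqrt(width_px^2 + height_px^2) / diagonal_inches
awk 'BEGIN {
  printf "MacBook, 1440x900 effective on 13.3\": %.0f ppi\n", sqrt(1440^2 + 900^2) / 13.3
  printf "1080p monitor at 21.5\":               %.0f ppi\n", sqrt(1920^2 + 1080^2) / 21.5
}'
```

which comes out to roughly 128 vs 102 ppi, i.e. the ~80% ratio above.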
I have a perfectly good normie dpi 25x16 display which is extra crisp on windows. On macOS I had to install betterdisplay just to make it not miserably bad; it’s just plain bad now. As far as I can tell Apple removed the feature because of greed and laziness.
I care about how things look, and have spent more time than I want to admit configuring MacOS apps to look good on the screens available to me. I just don’t care enough to buy an expensive office screen with my own cash if my employer can’t provide one.
That statement has no connection to the premise.
There are multiple reasons to use an old screen besides the mentioned reason of not caring for x.
Windows has historically been very good on low DPI, but they also managed to be great on HiDPI.
Linux, well, it depends on so many things … you can achieve good results on both, but you’d better be OK with integer scaling and not have multiple displays with different DPIs.
Then Mojave switched to Metal and removed subpixel AA, and now it’s the worst.
Thread from when it happened: https://news.ycombinator.com/item?id=17476873
No it hasn't.
Maybe to you "always" means "since 2014" but if so that means you are very young and you should not generalise from that.
I've been using Macs since 1988 and Mac OS X since 2001 and it used to be great on SD screens. I used to use Safari on Windows XP because its font rendering was so much better than the built-in Windows Truetype renderer.
This change is new and recent.
It is absolutely not "always".
xrandr --scale, xorg.conf, or nvidia-settings GUI and save to xorg config
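A concrete sketch of the xrandr route (the output name HDMI-1 and the 1.25 factor are placeholders; list your actual outputs with plain `xrandr`):

```shell
# Render a 1.25x larger virtual framebuffer and let the GPU downsample
# it to the panel's native 1920x1080 mode; text gets rasterized at the
# higher resolution, which is what reduces the blur.
xrandr --output HDMI-1 --mode 1920x1080 --scale 1.25x1.25
```

This doesn't persist across restarts, hence the xorg.conf / nvidia-settings options.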
This helped me a lot, I was about to ditch my external monitor because of blurriness.