The "retina" iMac 27" (5K display, ~218 PPI) came out in 2014.
I've been using this resolution since 2016. 32" 8K seems like the next logical step but it's disappointingly expensive/unavailable.
Personally I'm holding out for UHBR20
Edit: Also there's this one from 2017 https://www.dell.com/en-us/shop/dell-ultrasharp-32-8k-monito...
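As a sanity check on those PPI figures, here's a quick sketch (the diagonal formula, nothing more):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch = diagonal resolution / diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(5120, 2880, 27)))  # 27" 5K iMac -> 218
print(round(ppi(7680, 4320, 32)))  # 32" 8K -> 275
print(round(ppi(3840, 2160, 32)))  # 32" 4K -> 138, well below "retina" density
```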
Apple and Microsoft both make high DPI displays over 260 ppi on their largest 15-inch+ notebooks and tablets.
On: https://blog.aktsbot.in/img/stem-darkening-on.png
Personally, I never had much of an issue with fonts on any OS, except when I connected my M1 MacBook to a 1080p monitor; then it felt like the fonts had no anti-aliasing at all.
I’ve seen Linux render fonts like that, but not recently. I’d probably look into debugging my setup if that’s what my desktop looked like.
PS: talking about screenshots in the article.
$ cat $HOME/.config/fontconfig/fonts.conf
<?xml version="1.0"?>
<!DOCTYPE fontconfig SYSTEM "fonts.dtd">
<fontconfig>
  <match target="font">
    <test qual="any" name="size" compare="more">
      <double>1</double>
    </test>
    <test qual="any" name="size" compare="less">
      <double>22</double>
    </test>
    <edit name="rgba" mode="assign">
      <const>none</const>
    </edit>
    <edit name="antialias" mode="assign">
      <bool>false</bool>
    </edit>
    <edit name="hintstyle" mode="assign">
      <const>hintfull</const>
    </edit>
  </match>
</fontconfig>

Usually when people complain about this, the comparison is to Windows, which prefers strong hinting, i.e. snapping the font shapes to (sub)pixel boundaries.
There also used to be patent encumbrance issues around subpixel rendering, making Linux font rendering on TFTs overall worse by default, but as of 2019 those have expired. Some distributions had already enabled those features anyway.
Under a fair comparison on a 1080p display with no scaling, even Windows demolishes macOS these days. Apple dropped support for subpixel AA years ago, which really hurts standard-DPI rendering.
For example: https://i.redd.it/f61p6sa34cta1.png
Or: https://forums.macrumors.com/attachments/screen-shot-2022-10...
I don't think Windows is such a clear winner here. Seems like different philosophies.
The first image has less hinting, causing the top of the x-height to be rendered at a half pixel. This makes it seem less intense than the rest of the letters. The second image aligns way better and has more consistent intensity and sharp edges, but gives an overall slightly bolder appearance, and also compromises a tiny bit on the actual shapes of the letters.
On the other... Wouldn't this lead to terrible choices in font selection for anyone NOT using these settings?
Seems like full hinting + RGB antialiasing is the way to go on non 4K displays.
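A minimal fontconfig fragment for that combination might look like this (a sketch for ~/.config/fontconfig/fonts.conf, inside the usual <fontconfig> root; the lcdfilter line is my own assumption, adjust to taste):

```xml
<match target="font">
  <!-- Full grid-fitting, as on Windows-style rendering -->
  <edit name="hintstyle" mode="assign"><const>hintfull</const></edit>
  <!-- Subpixel antialiasing for a standard RGB-stripe panel -->
  <edit name="antialias" mode="assign"><bool>true</bool></edit>
  <edit name="rgba" mode="assign"><const>rgb</const></edit>
  <edit name="lcdfilter" mode="assign"><const>lcddefault</const></edit>
</match>
```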
No more blurry fonts in Linux - https://news.ycombinator.com/item?id=39640329 - March 2024 (2 comments)
No more blurry fonts in Linux - https://news.ycombinator.com/item?id=38853588 - Jan 2024 (1 comment)
Sorry about that rant.
Every discussion of font rendering technology must include a statement to the effect of "Acorn RISC OS fonts from 1990 have not been bettered". :-)
On Linux, font selection is terrible, and people have problems with Wayland/X11 rendering and other settings (often opinionated defaults from distros). But when you are lucky :)) you can get pretty much the same HiDPI font rendering.
I use non-HiDPI 1920-pixel-wide displays at scale factors between 1.25 and 2.0. (Yes, I like big type and big UI elements: my eyesight is bad.)
On Linux, good luck using anything else than integer scaling. And it’s a shame because with a 4K screen and fractional scaling, you can get both more definition AND more real estate.
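On a GNOME/Wayland desktop, for instance, fractional scaling is still gated behind an experimental flag (this assumes mutter; KDE Plasma exposes fractional scaling directly in its display settings):

```shell
# Let GNOME's Wayland session offer non-integer scale factors (1.25, 1.5, ...)
gsettings set org.gnome.mutter experimental-features "['scale-monitor-framebuffer']"
# Then pick the scale under Settings -> Displays.
```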
That has little to do with it. Apple has had vastly superior font rendering since the day OS X launched, and has been in first place ever since. There's no point in my memory of the past 20+ years where this was not the case, regardless of display technology.
Even though other systems implement a lot of the same techniques of subpixel rendering, fractional scaling, etc., the results speak for themselves. With the same monitors, the same display cables, the same fonts, and the same applications, text rendered by Apple operating systems is more crisp, clean, readable, and pleasing to the eye. Configuring settings on Linux improves the situation, but no configuration gets close to the quality Apple delivers by default. Apple is also superior to Windows in this regard, but the margin is much smaller.
This is coming from a person who maintains a long list of MacOS gripes because of all the things they do wrong. Font rendering is one of the few things they have consistently done better than everyone else.
Use a 32-inch 4K display and it will have blurry/thick fonts too.
It looks night-and-day better than the 2x4K 27" setup I use with my Windows 11 work laptop, even with ClearType. (That was a letdown: I replaced 2x1080p 21" displays and expected the upgrade to fix the woeful fonts on Windows.)
As for font choices, sure, the Mac has a better selection than open-source typefaces… but you know you can take San Francisco and install it on Linux if you want.