macOS doesn't even render to its own high-pixel-density displays correctly, owing to the (in my opinion) very naïve algorithm used. If you select any resolution that isn't an integer factor of the panel's native resolution, you get blurriness[1]. macOS renders to a backing store at 2× the 'looks like' setting, then scales that down to the actual monitor resolution. At any non-integer scaling ratio, that resampling inevitably blurs.
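To make the ratio concrete, here's a small sketch of that pipeline as I understand it; the function name and the 4K panel numbers are my own illustration, not anything from Apple:

```python
# Sketch of the macOS scaled-resolution pipeline: render to a backing
# store at 2x the "looks like" resolution, then resample that to the
# panel's native resolution. (Function name and numbers are illustrative.)

def macos_downscale_factor(looks_like, native):
    """Return the backing-store resolution and the factor by which it
    must be resampled to reach the native panel resolution."""
    backing = (looks_like[0] * 2, looks_like[1] * 2)
    factor = native[0] / backing[0]
    return backing, factor

native_4k = (3840, 2160)

# "Looks like 1920x1080": the backing store is exactly the panel
# resolution, so the final scale is 1.0 and pixels map 1:1 -- sharp.
print(macos_downscale_factor((1920, 1080), native_4k))  # ((3840, 2160), 1.0)

# "Looks like 2560x1440": the backing store is 5120x2880, which must be
# resampled by 0.75 to fit the panel -- a non-integer ratio, hence blur.
print(macos_downscale_factor((2560, 1440), native_4k))  # ((5120, 2880), 0.75)
```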
This is problematic enough that it defeats Apple's 'good font rendering'. I see shimmering and ringing artifacts around regions of high contrast (i.e. essentially all text) with such a non-native setup. I'm forced to use the integer-factor resolution, which makes everything far too big. Of course, I can scale my browser and VS Code independently, but the rest of the OS remains comically large. Needless to say, this also comes with a large performance impact: the compositor is always rendering a viewport with four times the pixel count of the selected resolution. It is also unintuitive to program against, especially with APIs like GLUT, SDL, etc.
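A back-of-the-envelope calculation of that overhead, again with illustrative numbers for a 4K panel at 'looks like' 2560×1440:

```python
# Cost of the 2x backing store: the GPU composites four times the pixel
# count of the selected "looks like" resolution, and at non-integer
# settings that even exceeds what the panel itself can show.

def backing_pixels(looks_like):
    """Pixels rendered per frame for a given 'looks like' resolution."""
    w, h = looks_like
    return (2 * w) * (2 * h)

selected = 2560 * 1440                   # 3,686,400 "looks like" pixels
native = 3840 * 2160                     # 8,294,400 pixels on the panel
backing = backing_pixels((2560, 1440))   # 14,745,600 pixels rendered

print(backing // selected)   # 4 -- four times the selected resolution
print(backing / native)      # ~1.78 -- more pixels than the panel even has
```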
Windows is the only OS that actually does high pixel density rendering correctly for programs that support it[2]. Windows works with the given monitor resolution, and scales UI elements according to the percentage value set (100% is 96 DPI). This is a lot more involved to program for, but when done right, it works exceptionally well. Everything that's not a raster image is always pixel-perfect. If it's not (and people have complained about this[3]), then there's a system setting/registry patch to make it so[4].
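The arithmetic of that model is simple enough to sketch; the helper names here are my own shorthand for the concept, not the actual Win32 API:

```python
# Sketch of the Windows scaling model: the desktop stays at the panel's
# native resolution, and DPI-aware apps scale their own layout by the
# per-monitor setting, where 100% == 96 DPI. (Helper names are mine,
# not Win32 functions.)

def dpi_for_scale(percent):
    """Map the Settings-app percentage to a DPI value (100% -> 96)."""
    return 96 * percent // 100

def dips_to_pixels(dips, dpi):
    """Convert a device-independent length to physical pixels."""
    return round(dips * dpi / 96)

print(dpi_for_scale(150))        # 144
print(dips_to_pixels(300, 144))  # 450 -- a 300 DIP widget at 150% scaling
```

Because the app draws directly at the target pixel size instead of being upscaled afterwards, nothing but raster images ever needs resampling.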
Windows also handles moving program windows between displays set to different DPIs quite seamlessly. The only issue I see is when a new display with a different scaling setting is set as the primary (and only) display, and then Windows Explorer scales things weirdly—which is fixed by restarting Explorer.
On Linux... Forget it. On Xorg there are a million environment variables and per-app configurations to set (just see how long the HiDPI article[5] on the Arch Linux wiki is). On Wayland, things are better, but not yet for me, since I use an NVIDIA graphics card, KDE Plasma, and Chrome, which is the worst possible combination for Wayland. It's not mature enough for this setup: the Windows-esque rendering (they call it 'fractional scaling') was only merged slightly more than a year ago[6], and Plasma 5, my DE of choice, still doesn't fully support it.
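To give a flavor of the Xorg situation, here are a few of the per-toolkit knobs that wiki page walks through; the values are illustrative for a 2× setup, and each toolkit needs its own, which is exactly the problem:

```shell
# A sample of the per-toolkit HiDPI settings needed on Xorg
# (illustrative values for a 2x setup).
export GDK_SCALE=2                     # GTK: integer UI scale
export GDK_DPI_SCALE=0.5               # GTK: undo the doubled font scaling
export QT_AUTO_SCREEN_SCALE_FACTOR=1   # Qt 5: derive scale from screen DPI
export ELM_SCALE=2                     # EFL applications
echo "Xft.dpi: 192" | xrdb -merge      # X resources: font DPI for Xft apps
```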
[1]:
[2]: https://building.enlyze.com/posts/writing-win32-apps-like-it...
[3]: https://news.ycombinator.com/item?id=38444967
[4]: https://serverfault.com/questions/570785/how-can-i-make-micr...
[5]: https://wiki.archlinux.org/title/HiDPI
[6]: https://gitlab.freedesktop.org/wayland/wayland-protocols/-/m...