What really suggests to me that something is wrong with the setup is that Beached seems to say they have to cross their fingers every time they run just a plain update, not a release upgrade.
I remember often having to fix something when dist-upgrading Ubuntu back when I used it on personal desktops between ~2005 and ~2014. I can't remember what kinds of issues there were, and apparently none of them bricked the system beyond repair, since the same install survived throughout those years, but I remember it wasn't flawless.
But random breakage from standard updates within a release is not something I remember happening on Debian, Ubuntu or modern Fedora with any frequency. (Except when I ran Debian unstable before Ubuntu, but that's a different story.) Double-digit failure percentages, or even whole single-digit ones, do indeed sound like they are from a different planet.
I switched away from Ubuntu on the desktop over some dissatisfactions nearly a decade ago (and I never really ran Kubuntu) so I don't have much experience with that from recent years. But I've now had the same install of Fedora since 2014 that's gone through more release upgrades than I care to count, and while release upgrades still have a non-zero chance of breaking something, standard updates have been pretty much flawless.
I suspect there's something rather different about the setups of people who report no issues and those of people who say standard updates frequently break in mainstream distros. Either there's a hardware difference, or a difference in the last 5% of needs the setup has to cover -- some specific software, specific peripherals, or specific workflows. E.g. someone doing sound production might be much more likely to run into sound issues than someone whose sound needs are Netflix on USB headphones.
My last couple of personal devices have been ThinkPads that I picked in part specifically for the compatibility. The previous one with almost everything Intel worked pretty much 100% issue-free for years. I can imagine that if I instead had a random-brand consumer laptop or desktop, a wifi card with some kind of flaky half-support, proprietary NVidia drivers, some slightly niche peripherals, and the need to run some kind of a proprietary application dynamically linking with shared libraries from 2017, the story might be quite different.
Of course it might also be that some of us are just so used to fixing small issues or working around them that they no longer register at all -- tinkering is so natural that it doesn't feel like tinkering, so the tinkering effectively doesn't exist. I wonder if that's part of the difference in the reported experiences.
But if someone reports a double-digit percentage of stuff breaking on a standard update, there's something else going on as well.