99.99999% of software in big corporations will not have even 5% of the quality of the Linux kernel. The reason the kernel has such high quality is that Linus acts as a dictator, training everybody — in not-always-so-nice ways — to write code that doesn't break anything and that is maintainable.
He cares about the code and he has the status and the mandate to prevent it from becoming shit.
Applications have gotten better too. Browsers have gotten significantly more robust against misbehaving pages. Microsoft Office doesn't eat my work (even if I forget to hit ctrl-s).
In fact, I would go so far as to say that the only software which has gotten worse is video games. Used to be you could put in a disc, install the game, and be reasonably assured that you were getting a playable final product. Today, you put in the disc, install the game, and then have to download multiple gigabytes of patches... and the game is still often buggy (Bethesda, I'm looking at you!).
>99.99999% of software in big corporations will not have even 5% of the quality of the Linux kernel.
That is true. But it was equally true when Linus Torvalds dropped the first version of the Linux kernel all the way back in '91. It's not clear to me that things have gotten worse since then.
For the longest time, AAA studios mostly released simple first person shooters with straightforward enemy AI and simple physics. And Bethesda and Obsidian released gloriously buggy RPGs. Nowadays, every game includes open world elements, RPG elements, and more complicated NPC interactions... and it turns out that they are all full of bugs. Complex games have complex problems that don't reveal themselves until players do weird things.
Not to mention all these RPG systems add a whole additional layer to mess up -- character stats might not be 'buggy' exactly, but they might be very poorly 'balanced.' It is really easy to miss a skill interaction, and sometimes multipliers end up exploding. I mean, we saw Blizzard fail to balance Diablo II for a decade or so, and Wizards of the Coast tries to balance D&D but that takes all the fun out of character building -- RPGs of any significant complexity are, I think, just fundamentally prone to exploding numbers.
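A quick back-of-the-envelope sketch of why that happens (the numbers here are made up for illustration, not from any real game): if bonuses that the designers reasoned about additively actually stack multiplicatively, a modest per-node bonus compounds fast.

```python
# Hypothetical example: ten skill nodes, each granting a +50% damage bonus.
# If the bonuses sum, the player ends up at 6x base damage.
# If they multiply, the player ends up at ~57x base damage.
additive = 1.0 + 0.5 * 10        # 1 + ten +50% bonuses summed
multiplicative = 1.5 ** 10       # ten +50% bonuses compounded

print(f"additive stacking:       {additive:.1f}x")        # 6.0x
print(f"multiplicative stacking: {multiplicative:.1f}x")  # 57.7x
```

With dozens of interacting stats, it only takes one multiplier being applied in the wrong place for the numbers to blow up like this.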
Having done a lot of work on the kernel, I think you’d be surprised at the relatively low quality of a lot of kernel drivers.
The core code is generally quite good, but it’s not true that the kernel is full of pristine code.
I’ve worked at multiple companies whose code quality standards would never have allowed a lot of the weirder stuff I’ve seen (and often fixed) in kernel driver code.
The kernel, unlike many other understaffed open source projects, has lots of developers working on it (the majority paid these days). At the same time, unlike software built by corporations, there are no non-technical PHBs to say "ship it now and clean it up later".
Linux has found the sweet spot of getting companies on board to provide paid developers (it's much easier to do good work when you can allocate large blocks of time to it because you're paid to do it) whilst at the same time preventing them from having too much say over the technical direction and timelines. Typically a company employing kernel developers will have a say in what they work on (drivers, core kernel, ...) but not in how they do it.
But it's not really a model that can work for all open source software.
For related reasons, I think companies that build lots of software but whose product isn't actually software (like Google, which is really an advertising company) often produce better quality software than pure software companies (like Microsoft), because the technical staff are left more alone by management, who prefer to concentrate on the "real business".
The earliest kernel I used in something one might call production was 1.2.12, around 1995. I must say that even then, with those early kernels, I had no panics at all and much higher uptimes (patching for security wasn't as much of an issue at that time ;-) )