Windows is maintained by morons, and gets shitter every year.
Linux is still written by a couple of people.
Once people like that die, nobody will know how to write operating systems. I certainly couldn’t remake Linux. There’s no way anyone born after 2000 could, their brains are mush.
All software is just shit piled on top of shit. Backends in JavaScript, interfaces which use an entire web browser behind the scenes…
Eventually you’ll have lead engineers at Apple who don’t know what computers really are anymore, but just keep trying to slop more JavaScript in layer 15 of their OS.
I think I did ok. Would I compare myself to the greats? No. But plenty of my coworkers stacked up to the best who'd ever worked at the company.
Do I think MS has given up on pure technical excellence? Yes, they used to be one of the hardest tech companies to get a job at, with one of the most grueling interview gauntlets and an incredibly high rejection rate. But they were also one of only a handful of companies even trying to solve hard problems, and every engineer there was working on those hard problems.
Now they need a lot of engineers to just keep services working. Debugging assembly isn't a daily part of the average engineer's day to day anymore.
There are still pockets solving hard problems, but it isn't a near universal anymore.
Google is arguably the same way, they used to only hire PhDs from top tier schools. I didn't even bother applying when I graduated because they weren't going to give a bachelor degree graduate from a state school a call back.
All that said, Google has plenty of OS engineers. Microsoft has people who know how to debug ACPI tables. The problem is that those companies don't necessarily value those employees as much anymore.
> I certainly couldn’t remake Linux
Go to the OSDev wiki. Try to make your own small OS. You might surprise yourself.
I sure as hell surprised myself when Microsoft put me on a team in charge of designing a new embedded runtime.
Stare at the wall looking scared for a few days then get over it and make something amazing.
I was there in the DOS days. I was there when Windows 3.1 came out (others too but I didn't use them). I was there when Windows 95 came out.
Microsoft has never been about "pure technical excellence". We had wonderful machines (Unix ones, and then stuff like the Atari ST / Commodore Amiga / Archimedes) and amazing OSes (including Unix on workstations), and Microsoft nearly destroyed everything with the endless turds it produced that ran on cheap beige PCs. Not excellence. Mediocrity. Cheap, but mediocre.
At some point 95% of all machines sold with an OS had Windows and the times were incredibly dark. Thankfully things changed and now Windows is only present on something like 11% of all devices sold yearly that have an OS.
We dodged a big one and many of us shall never ever forget how slow, insecure, horrible and mediocre the products of that company were.
Microsoft's goal was to make machines everyone could afford. Their mission statement was a desktop in every home and they pulled it off.
They didn't pull it off by making an OS that needed a boatload of custom chips (Amiga), or that required a huge, beefy system to run (OS/2, Unix).
They did it by making compromises that kept costs down and made computers accessible. They pushed for multimedia standards when the technology was appropriately matured, and their consumer OSes evolved in maturity as Moore's law progressed. Even then everyone complained about "ever growing" system requirements, especially when the move to XP happened, and then again when Vista came out with its improved security model.
Those fancy slick Sun OS boxes cost a fortune compared to a Windows box of the same time. Sure the Windows box crashed, but as a kid growing up in a working poor family in the 90s I was able to afford Microsoft's imperfect OS, because they had purposefully built an entire ecosystem that was designed to be affordable.
Microsoft pitted every PC OEM against each other in a race to the bottom, until the margin on a new PC approached and then fell below zero.
I've used tons of different systems. Thanks to the efforts of Valve, desktop Linux is now usable, but it still has a thousand stupid bugs, many of which I wouldn't have tolerated on Windows 20 years ago. macOS is very black-box-ish, and despite daily driving it for 6 or so years now, the machine doesn't feel like it is "mine" in the same way a Windows 7 or Windows 2000 machine did. The old 16-bit graphics powerhouse machines were sexy, but those custom chips didn't age well, and there was no way they could compete with an open standard like the PC.
Perfect is the enemy of the good. Microsoft very much made software that was good enough, but the truth is that good enough is also admirable. Good enough is affordable, it is quick to market, it is adaptable, it is usable by the masses in a way that perfect isn't.
This is certainly false. There are plenty of young people that are incredibly talented. I worked with some of them. And you can probably name some from the open source projects you follow.
There's also some specific, measurable "turning to mush" going on, like reduced literacy rates and falling IQ scores (the slowing or reversal of the Flynn effect).
In fact, today on GitHub alone you can find hobbyist OSes that are far, far more advanced than what Linus's little weekend turd ever was originally.
Their success is not gated by technical aspects.
I know this forum is highly skewed towards SaaS/JS/web stuff, but there's an entire industry of deep tech software, and the payouts are excellent.
How is that? It's easily the software project with the largest number of contributors ever (I don't know if it's true, but it could be true).
Rent-seeking and promo-seeking are the only motivations for the people with the power.
None of that class wants to make a better product, or make life better or easier for the people.
But the lazy (and wrong) belief, held by people not committed to exacting standards in their engineering, that AI is just another layer of abstraction or another scripting language actually obscures a much more unpleasant fact: performance, as far as the managerial class was concerned, was never about getting the best performance. It was always about whatever was just enough.
We as coders used to prioritize performance because hardware was so limited and we wanted to squeeze the most out of every cycle, every kilobyte of RAM. For some of us, that habit will never die, because we look at a new piece of hardware and realize how much more we can make it do.
But pre-AI slop of backends with huge supply chains and Electron as a frontend arose because memory and compute had become so cheap that acceptable performance required less and less optimization.
That doesn't mean that some of us didn't maintain a niche in making things optimized, but for the past twenty years or so there's been a whole generation of engineers whose priority has been speed of development. And from the perspective of a company that treats engineers as disposable cogs and prioritizes frameworks and assumes Moore's Law, why not?
AI just takes that to the next level. Take the entire chain of existing React slop and create a Markov chain to regurgitate parts of it on cue. And let's be honest: 95% of companies don't need to forge anything particularly new, they just need to cobble existing parts together.
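To make the "Markov chain regurgitating parts of it on cue" jab concrete, here's a toy sketch (purely illustrative, with a made-up one-line "corpus"; real models obviously work differently): build a table mapping each token to the tokens that have followed it, then walk the table to spit out plausible-looking recombinations of the input.

```python
import random
from collections import defaultdict

def build_chain(tokens, order=1):
    """Map each state (a tuple of `order` tokens) to the tokens seen after it."""
    chain = defaultdict(list)
    for i in range(len(tokens) - order):
        state = tuple(tokens[i:i + order])
        chain[state].append(tokens[i + order])
    return chain

def generate(chain, start, length, seed=0):
    """Walk the chain from `start`, emitting up to `length` more tokens."""
    rng = random.Random(seed)  # fixed seed so the walk is reproducible
    state = tuple(start)
    out = list(state)
    for _ in range(length):
        followers = chain.get(state)
        if not followers:
            break  # dead end: this state never had a successor in the corpus
        out.append(rng.choice(followers))
        state = tuple(out[-len(state):])
    return out

# Hypothetical stand-in for "the entire chain of existing React slop":
corpus = "const App = ( ) => { return <div> hello </div> ; }".split()
chain = build_chain(corpus)
print(" ".join(generate(chain, (corpus[0],), 10)))
```

Every token it emits is one the corpus already contained, following a transition the corpus already contained, which is the point of the jab: nothing new is forged, existing parts are cobbled together.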
I thought this about 15 years ago, talking to CRUD coders who hated their jobs: you're in the wrong business if you're not getting joy out of creating and solving new problems. So in a way, AI just gives everyone who only wanted shitty software the shitty software they deserve? I don't know. I haven't finished thinking about it.