And generally this doom-saying by people like Jonathan Blow doesn't resonate with me at all. Sure, there is a lot of waste. But software is also better than it ever was: 3D graphics are better, as are programming languages, Unicode support, and the software development process; hardware is faster; all the clunky C tools are being replaced with modern equivalents in Rust/Go with far better usability; cryptography and security are better; desktop environments are better; file systems are better. I could go on.
In my mind the only things getting worse are the commercial stuff: the web sucks and has stolen the thunder from native apps, and Windows/macOS are degrading.
It depends on what you're measuring and what you consider "better". In many ways, it is better. In many other ways, it's worse. Which you think it is depends on what you value.
Sure, but even more so on what software you use. AFAICT most of the complaining is about Windows software and the web.
Windows is getting crappier and crappier because it has ceased to be an important part of Microsoft's empire, and the web... is the web; it went full stupid on JS and attracts most newcomers.
On the Linux and especially CLI side everything seems to be flourishing.
Well, a number of people have gone to prison for software piracy, which would seem to contradict your claim.
What if our appliances broke, on average, once in a hundred years?
What if our tools were so durable they could survive for generations?
What if our furniture and our building materials could be reused almost endlessly?
Would the economy shrink? Would the manufacturers go out of business? Or would all the money saved go towards other, better things?
Would our economy shrink? Probably, since our economy is based on ever-expanding growth. The better question, in my view, is "would that be a bad thing?"
What if a business actually took a long-term view: investing in standards and fostering its ecosystem instead of trying to outmaneuver competitors with whatever short-term tricks are available? What if a company made a great dishwasher and only changed it when they could improve it? Would they inevitably be driven into extinction or bought up by more short-term, profit-hungry enterprises? Maybe... but is that really inevitable?
Apparently so, e.g., "The Instant Pot" (https://archive.is/DKBzB)
This quote is the foundation of the blog post and unfortunately misses the point entirely.
Yeah, I agree with all these arguments against using LLMs for software development. That's precisely why they aren't going to be used for that.
Nobody wants to increase the cost of real development. Everyone would rather skip having to write code at all to get a result. From a high-level perspective there's no difference between an LLM producing junk results and a team of humans writing bugs.
Though when humans get it right, they really get it right, and the product is worth much more thanks to the maintainability built on a good release. This is the insight the blog post misses. Many businesses will still have to pay for real software development, and those days are not numbered.
LLMs really do grow the market legitimately. Low-quality software is like store-brand canned goods: a high upfront cost for the factory, but the product can scale on other merits.
These products are not mutually exclusive. You can get rid of your call center with LLMs, including all the software the agents needed to service an account, but you still want humans writing the critical infrastructure and all the high-visibility frontend stuff, because you care about data integrity and good UI/UX. This is what most devs already do all day. There's no threat here.
Now, we can say with confidence that some inefficiencies don't work and are just inducements for everyday waste and corruption. But the majority of those aren't hyped technologies; they are longstanding norms. The hype and investment are, rather, a hope that the new problem saves us from the old one in some respect - that, you know, maybe you can use all this new stuff to manufacture your own dishwasher, exactly how you want it, without a middleman company to make it "smart".
If you want to write software that is "nice" in the sense of longevity, you have to target archival materials, which isn't within computing's mainstream, but is a major aspect of retrocomputing: certain elements have proven to last, and within their context, you can just pick them up and use them again and again, for the rest of your life. You can reasonably hope to target VGA, PS/2 keyboards, 6502 processors, etc.
If you leave the software open to using new I/O, new protocols, you are engaging in speculation. Speculation needs financial motivations to fill in gaps: you can't get everywhere you want to go by working on the fun parts. You build a company because you are trying to fill in a gap. It is not pretty and sometimes you break some hearts, minds and bodies by being the person doing that, but it's quite a bit better than an armed struggle.
but you're right, opinions you don't agree with are dangerous.
I've made similar observations in general. We often foresee the future as a reflection of the "here and now" rather than understanding the long-term implications of decisions made at vast scale.
There are many great examples in plain sight in everyday life.