Not what I said. And I think you are the one that's doing historical revisionism now.
Even in this email from 1983, it starts off with
> since its 68000 microprocessor was effectively 10 times faster than an Apple II
From the 80s through the 00s (which I was alive through and very aware of), computer hardware was frequently doubling in performance. The common wisdom then was to make things just fast enough; anything more was a waste of time, because in a year or two hardware would be twice as fast.
The wastefulness of today came directly from that past wisdom. I can guarantee you that for as long as I've been aware of discussions about software, there have been people bemoaning how sloppy and wasteful it has become. People complained about how bloated Windows XP was compared to 98.
Ruby, Python, Perl, Java. All these bloated and slow programming languages got their starts in the 80s and 90s, exactly because of the wisdom that "it's slow today, but tomorrow's hardware will make it fast." Heck, even C and Lisp are manifestations of this. Consider that people weren't writing all software in assembly during the time period in question, even though there were clear performance benefits to doing so, since compilers at the time were particularly bad.
I've worked with a lot of older devs, and they all hold the attitude that performance optimization is a complete waste of time. They've been the hardest ones to break of that notion. Younger devs tend to grasp more intuitively that performance optimization matters. That's because over the last decade, hardware performance improvements have stagnated.
So yes, absolutely yes. In the past, if you could make writing software more ergonomic by sacrificing some memory or performance, that was a tradeoff most of the industry would gladly take. They wrote for today's hardware and sometimes tomorrow's.