I have a four-year-old laptop that cost me $3000. Xcode is sometimes so laggy that it misses keystrokes as I type, and words come out garbled beyond recognition. A few weeks ago, some source code I downloaded from GitHub wouldn't compile because the Swift compiler spent too long on type inference and gave up with an error.
I'm haunted by a vision of computing. In the vision, we software folks just add bloat to everything until it starts feeling gluggy and slow on our modern, expensive computers. Then we optimize our programs just enough to keep them running vaguely OK on whatever hardware is on our desks.
The only way to have a snappy computer running modern software is via the tireless work of hardware engineers. Whenever a big hardware performance improvement arrives (like the M1), there's a window of a year or so where, if you upgrade, everything runs fast again. And of course, all the devs with the new machines stop optimizing their programs, and gradually everything slows down again. Eventually our software is back to being as slow as it was before; and if you didn't upgrade your computer, it now barely functions.
I want off this ride. My six-year-old FreeBSD server is somehow still snappy and responsive. Maybe the answer is to just not run modern desktop software.
I maintain that, even today, you could've gotten 80-90% of the benefits of Swift with a syntax refresh and a few new cheap-to-compile features added to Objective-C itself, like stronger nullability annotations and ADT-style enums.
I think the 'good enough software' era will end when physical limits start constraining improvements in compute, because at that point the only way to gain a competitive advantage will be through better software.
I hope that day is far off, though, because it would mean that something like 8K consumer VR, where the software is written to take proper advantage of the hardware, will never happen.
We're having this conversation in the context of Notion. Starting up Notion on my laptop feels like wading through honey. I really want to love Notion, but every time I open it and try to do anything, I feel my motivation trickling away. Notion isn't even doing anything impressive with all that compute; it's just slow. The sales pitch I make in my head for Notion is that it's a place to organize all my notes and workflows, but, as with Jira, living in Notion means accepting that I work at the speed Notion does, and that feels antithetical to the sense of productivity I get in snappier, lighter tools like Bear and iA Writer.
But Notion and Xcode aren't really the problem. The problem is cultural. It seems to be worst in the web ecosystem, though it's not confined there. My favorite example of all is this issue in web3.js[1]. I can't help but play Benny Hill music in my head while I scroll through this three-year-old, apparently unsolvable issue thread.
When the new M1 Macs came out, lots of people complained that there was no 32 GB variant. Holy cow, that is so many bytes. The Atari 2600 had 128 bytes of RAM. The NES ran Super Mario Bros. in 2 KB. It's completely ridiculous to me that our software can fill even 16 GB doing everyday computing. Is there a ceiling? Will there ever be a ceiling, or should we expect a document editor in 2030 to sit on 256 GB of RAM, just because that's how much RAM modern JavaScript frameworks use by then?
[1] https://github.com/ethereum/web3.js/issues/1178#issuecomment...