> What killed that balance wasn't raw speed, it was cheap RAM. Once you could throw gigabytes at a problem, the incentive to write tight code disappeared. Electron exists because memory is effectively free.
I dunno if it was cheap RAM so much as developer convenience. In a recent comment on HN (https://news.ycombinator.com/item?id=46986999) I pointed out the performance difference on my 2001 desktop between an `ls` program written in Java at the time and the one that shipped with the distro.
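To make that concrete, here is a minimal sketch of what a Java `ls` looks like (a hypothetical reconstruction, not the actual 2001 program; the class name `JLs` and its `listDir` helper are my own). The point is that even this trivial program pays the JVM startup tax that the native `ls` doesn't:

```java
import java.io.File;
import java.util.Arrays;

public class JLs {
    // Return the entry names of a directory, sorted, or null if unreadable.
    static String[] listDir(String dir) {
        File[] entries = new File(dir).listFiles();
        if (entries == null) return null;
        String[] names = new String[entries.length];
        for (int i = 0; i < entries.length; i++) names[i] = entries[i].getName();
        Arrays.sort(names);
        return names;
    }

    public static void main(String[] args) {
        String[] names = listDir(args.length > 0 ? args[0] : ".");
        if (names == null) {
            System.err.println("jls: cannot read directory");
            System.exit(1);
        }
        for (String n : names) System.out.println(n);
    }
}
```

Running `time java JLs` next to `time ls` shows the gap: the work itself is identical, but the JVM has to start, load classes, and warm up before a single directory entry is printed, and on 2001-era hardware that overhead dwarfed the actual workload.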
Had processor speeds not increased at that time, Java would have been relegated to history, along with a lot of other languages that became mainstream and popular (Ruby, C#, Python)[1]. There was simply no way companies would have kept spending 6 to 8 times more on hardware for the same workload.
C++ would have become the enterprise language solution (a new sort of hell!), and languages like Go (native code with a GC) would have been created sooner.
Between 1998 and 2005, CPU speeds were increasing so fast there was no incentive to develop new languages. All you had to do was wait a few months for your program to run faster!
What we did was trade off efficiency for developer velocity, and it was a good trade at the time. Since around 2010, hardware performance gains have been shrinking, and faced with that stagnation, new languages were created to claw the efficiency back (Go, Rust, Zig, Nim, etc.).
-------------------------------
[1] It took two decades of constant work for those high-dev-velocity languages to reach some sort of acceptable performance. Some of them are still orders of magnitude slower than native code.