It sure seems like modern hardware's speed is spent on making things slightly easier for devs rather than on performance for users, huh. If 29 layers of abstraction save a little dev time over 5 layers, that's fine, but it creates this unnecessary requirement to own less-than-5-year-old hardware just to run things smoothly.
From Electron instances everywhere to the niche suckless philosophy, so much of software seems to be programmer-first. Truly missing the forest for the trees.