In fairness to them, a lot of programmers didn't come up the way we (presumably) did. If you started using computers and programming in the '80s and building computers in the '90s, your worldview is going to be fundamentally different to someone who started in 2018.
We came from a world where bytes mattered; they come from a world where gigabytes matter.
In some ways, caring about that stuff can be detrimental. At the back of my mind there's always that little niggle: you could do this at a tenth of the runtime/memory cost, but it'd take twice as long to write and you'd be the only one who understands it.
These days we don't optimise for the machine but for human time, and honestly, that's an acceptable trade-off in many (but not all) cases.
It can be frustrating, remembering how much of an upgrade getting a 286 was over what you had before, to realise that I now routinely throw the equivalent of thousands of them at a problem inefficiently and still get it done in under a second.