E.g. Slack and Atom got absolutely lambasted for performance, sluggishness and resource use (while VS Code was applauded, so it's clearly not an Electron-specific thing), despite being made by companies valued in the billions and based in the most expensive region of the world, one of them even being a paid product.
Or a game with pixel art graphics (which I do like, and I understand that particular indie dev optimizing for time with such a niche product, so I won't name names here) and gameplay only as deep as some of the better Flash games from the mid-2000s lists several GBs of RAM and disk space as its minimum system requirements (for comparison, Doom 3's recommended, not even minimum, was 512 MB in 2003).
Or when a graphically simple 2D game requires a 64-bit OS (despite seemingly using no 64-bit features) and a non-integrated GPU (not because of some missing OpenGL features but due to poor optimization), and runs at 30 FPS on an integrated Intel that has zero problems with Minecraft at a really far draw distance. And it attempts to load hundreds of files (all of the game assets for an entire 4-10 hour long VN) at boot, taking 30 seconds on an HDD. They could be loaded incrementally (loading only what is needed right now and everything else in the background, even dumbly and fully into RAM as it does now), or packed into SQLite or a ZIP to avoid so much FS access, but no: hundreds of files are opened at game boot, and there are tons of XML assets with zero compression or minification. Yet the solution to performance woes (in gaming especially, but through things like Electron it's seeping into the mainstream) is apparently to "git gud", "stop being a poor pleb" and get a new GPU (apparently a GTX 950M is a potato-level GPU now and only an idiot would play games on it in 2017) or an SSD, so that the developer doesn't have to bother with the tiniest bit of optimization.
That 2D game loading all its assets at boot and wanting a 64-bit CPU and a non-integrated GPU, all for no good reason, was Tokyo Dark, by the way, and given how the developer carries themselves I have no problem name-dropping them. I made an entire video about that game; the disk and GPU part is at 15:15 : https://www.youtube.com/watch?v=sCXwgPJGLIE
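The packing idea isn't exotic, either. Here's a minimal sketch with plain tar (the asset names are made up, not the game's actual files): one archive means a single open() plus mostly sequential reads, instead of hundreds of scattered file opens.

```shell
# Hypothetical asset tree standing in for hundreds of loose XML files.
mkdir -p assets
for i in $(seq 1 200); do
  printf '<scene id="%d"/>' "$i" > "assets/scene$i.xml"
done

# Pack once at build time: one file on disk instead of 200.
tar -cf assets.tar assets

# At runtime, stream just the member you need right now.
tar -xOf assets.tar assets/scene42.xml
```

A real game would index the archive offsets once and seek directly, but even this naive version kills most of the FS chatter.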
It feels like what was done with Crash Bandicoot is some interstellar Death Star-level technology compared to what some developers do: not even bothering to pack files to reduce FS chatter, load smartly, or compress textual assets. They probably developed it on an SSD, it loaded fast enough for them, so it's done and ready for shipping, duh! Just gotta write some hype text about how extensively we tested it and how much effort we put into making it!
I realize I sound like an ass that's ranting and that I'm writing at too much length (I have thought about writing articles instead of lengthy HN comments like this one, so if someone is interested, feel free to hit me up), but some of this stuff just blows my mind in ways I didn't know existed.
It's not even optimization for dev time, like Python could feasibly be, but sometimes outright waste or lack of basic care. E.g. Slack was apparently launching a full-blown browser per organization until recently (or something like that), completely needlessly; that part has since been fixed. At the same time they had this crazily involved (and cute, because it's 2017 and things must be cute) error page: https://slack.com/asdsad . Or there's that semi-notorious reply article from a guy using the unix CLI instead of hip BigData(tm) tools to analyze a relatively small amount of data (yes, he rubs it in a bit too hard when he brings out mawk): https://aadrake.com/command-line-tools-can-be-235x-faster-th...
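The linked article's point reproduces in miniature (this is a toy sketch of the general approach, not the article's actual code): a single awk pass over a file does the kind of aggregation people spin up clusters for.

```shell
# Tiny stand-in for the article's chess-results data.
printf 'game1,1-0\ngame2,0-1\ngame3,1-0\n' > results.csv

# Count occurrences of each result in one streaming pass, no cluster.
awk -F, '{ n[$2]++ } END { for (r in n) print r, n[r] }' results.csv | sort
```

Output is `0-1 1` and `1-0 2`; awk streams the file, so memory use stays bounded by the number of distinct keys, not the input size.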
That lack of care is evident in other areas too. In security it manifests as all these SQL injections, IoT botnets, outdated-software pwns and plaintext or unsalted-SHA1 password debacles. Afterwards it gets justified with "state attack, China or Russia probably", or handwaved away with "we store passwords in plaintext so we can email them to users who forget them" (an actual explanation I read once..) or "we innovated so fast to deliver a SUPERB customer experience that we didn't focus on security" (where 'security' would amount to, say, closing an admin port on an IoT appliance..). In general software we also get things like that TP-Link repeater (recently on HN) that needlessly queries NTP every 5 seconds, squandering hundreds of megs of transfer per month and basically DDoSing the NTP servers.
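On the password side, the gap between the debacle version and the bare minimum is a few lines. A sketch of the principle only (a real system should use bcrypt/scrypt/argon2, not a bare fast hash):

```shell
pw='hunter2'

# Unsalted: every user with this password gets the identical digest,
# so one precomputed lookup table cracks all of them at once.
printf '%s' "$pw" | sha1sum

# Per-user salts: same password, different digests, so precomputed
# tables and cross-user digest matching both stop working.
printf 'salt-a:%s' "$pw" | sha1sum
printf 'salt-b:%s' "$pw" | sha1sum
```

The salt isn't secret; it just forces the attacker to attack each account individually.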
It's like this entire mentality that doing things well is too hard, too complicated, or too expensive (like that chess guy and his "clever multi-threaded application"), while Pareto is very much in effect: even something as small as not opening a hundred files at once at game boot, or reading the dense man/info pages, or thinking for 20 minutes about the problem at hand, or some back-of-the-napkin math could make a big difference. Saving 10 or 20 minutes, or even hours, of dev time per year is not a big enough reason to squander resources so badly. There's an expression in Polish that seems really apt for developers who "optimize" their time to that degree: korona ci z głowy nie spadnie ("the crown won't fall off your head", meaning roughly that exerting a little effort towards something isn't too much to reasonably ask or expect of you).
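The napkin math really is napkin-level. For the file-opening case (assumed round numbers: ~10 ms average HDD seek, a few hundred scattered files):

```shell
# 300 scattered small-file opens at ~10 ms of seek time each means
# seconds gone before a single byte of useful data is read.
awk 'BEGIN { files=300; seek_ms=10; printf "%.1f s lost to seeks\n", files*seek_ms/1000 }'
```

That's seconds of pure seek overhead, which matches the observed 30-second HDD boot once you add the actual reads and XML parsing on top.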
I recall a similar event when someone wanted to stress test a webserver and had a file with a few million URLs in it. He did a while read line; curl $line loop in bash, and it brought his local machine to its knees, probably due to the rapid process creation and destruction. I gave him an xargs invocation with -P and -n to launch a single curl per 100 URLs instead, and it ran with no problem; this time the webserver we were testing was brought to its knees by my much weaker laptop (the weakest in the company, actually, since I wasn't a programmer and didn't need a strong one), as intended. I'm guilty of overengineering myself: my first attempt was a Python 3 + requests + grequests script, and only weeks later, after I forgot where I put the script and didn't want to rewrite it, did I run that xargs version (a very Taco Bell-esque solution, actually: https://news.ycombinator.com/item?id=10829512 ). That's just an anecdote, but it feels like people (actual 'professionals' making paid products and working at $billion+ corps) ship stuff as bad as the original one-curl-per-URL script as if it's no big deal, and then it gets justified with some handwaving: "focus on features, not performance and security", "no one is gonna hack a toaster", "computers are fast and cheap", "optimizing for dev time", etc.
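For the curious, the shape of that fix (demonstrated here with echo so it runs without a webserver; with curl it's the same flags):

```shell
# Stand-in for the URL list.
seq 1000 > urls.txt

# The slow way forks one child process per line, millions of times:
#   while read u; do curl -s "$u"; done < urls.txt

# xargs batches 100 arguments per child (-n 100) and runs up to
# 4 children in parallel (-P 4): 10 invocations instead of 1000.
xargs -n 100 -P 4 echo < urls.txt | wc -l
```

With curl the batch form works because curl accepts multiple URLs per invocation, so the fork/exec cost is amortized 100-fold while -P keeps the pipeline saturated.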
It's a typical high-volume, low-margin situation, like what Steve Jobs said while the original Mac was being built: shaving even a few seconds off the load time effectively saves lives, because so many people will use the Mac so often that the saved seconds add up to a few lifetimes.