By a helpful conclusion I don't mean just stating the facts, or comparing transistor counts or input latency with network latency, as if developers stopped caring and created crappy software on purpose.
The post compares an Apple IIe, a single-tasking machine that just echoes the pressed key to the screen in its BASIC interpreter, with modern devices, where it is not always clear what kind of app or setup is being measured. But we do know there are plenty of layers of GUI and OS code involved, layers most people wouldn't want to give up. Not to mention that most of the added input lag is not detectable by humans under normal working conditions.
Yes, there were years when CPU performance couldn't keep up with added features like as-you-type spell checking. I used computers through all those years and know this first-hand.
And I never dismissed the importance of input lag. I pointed out the oversimplification used to support the linked post's main argument, which suffers as a result.