I'm claiming that people keep making this argument, and it no longer holds true, because software is already losing too much performance for the value we get back.
That's my whole thesis.
Slowing down networking affects the entire machine, especially insofar as a computer is increasingly just a dumb terminal to something else.
Look, if you make network requests potentially 20% slower, then browser performance will be impacted too. It's so obvious that I'm not sure how to explain it any more simply.
By how much? I am not sure, but you can't say it won't be slower at all unless we're talking about magic.
Pretending the performance drop is trivial, without evidence, is the wrong approach. Show me how you can get similar performance with a 20% increase in latency and I will change my stance here (rough numbers sketched below).
As it stands, there are two things I know to be true:
Browsers rely on networking (as do many things, btw), and software these days is increasingly slow while delivering similar value.
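To make that 20% figure concrete, here's a rough back-of-envelope model (every number is hypothetical, chosen only for illustration): treat a page load as a chain of dependent round trips plus fixed server time, and see how a per-request slowdown propagates to the total.

```python
# Back-of-envelope sketch (all numbers hypothetical): a page load modeled
# as a chain of dependent network round trips plus fixed server-side work,
# to show how a per-request latency increase compounds across the chain.

def page_load_time(rtt_s: float, sequential_depth: int, server_time_s: float) -> float:
    """Total time for `sequential_depth` dependent round trips plus server work."""
    return sequential_depth * rtt_s + server_time_s

baseline_rtt = 0.050   # 50 ms round trip (assumed)
depth = 10             # HTML -> CSS -> fonts -> API calls, etc. (assumed)
server_time = 0.5      # total server-side processing (assumed)

baseline = page_load_time(baseline_rtt, depth, server_time)
slowed = page_load_time(baseline_rtt * 1.20, depth, server_time)

print(f"baseline: {baseline:.3f}s, +20% RTT: {slowed:.3f}s "
      f"({(slowed / baseline - 1) * 100:.1f}% slower overall)")
# baseline: 1.000s, +20% RTT: 1.100s (10.0% slower overall)
```

The point of the sketch is only that when round trips dominate the load, a per-request latency increase shows up in the total; how much depends entirely on the mix, which is the part that would need real measurement.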
Grandma doesn't care if her tablet can't saturate a WiFi 6 link. Grandma doesn't care if her bank's web page takes an extra 75µs to traverse the user-land network stack. But she will care a whole lot if her savings are emptied while she manages her bank account through her tablet. Even worse if her only fault was having her tablet powered on when a neighbor's smart toaster compromised it through a remotely exploitable vulnerability in her tablet's WiFi stack.
Or are you suggesting that grandma should've known better than to let her tablet out of a Faraday cage?
> Pretending the performance drop is trivial, without evidence, is the wrong approach.
Amdahl's law begs to differ. If it takes 5s for the website to arrive from the bank's server, spending 5µs or 500µs in the network stack is completely irrelevant to grandma. Upgrading her cable internet to fiber to cut those 5s down to 500ms will have a much more positive impact on her user experience than optimizing the crap out of her tablet's network stack to get from 5µs down to 1µs.
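Spelling the arithmetic out (the 5s, 5µs, and 500µs figures are the ones above; the computation is just Amdahl's law):

```python
# Amdahl's law with the figures from this comment: the end-to-end speedup
# from optimizing one component is bounded by that component's share of
# the total time.

total = 5.0          # 5 s end-to-end page load
stack_slow = 500e-6  # 500 µs spent in the network stack
stack_fast = 5e-6    # 5 µs after heavy optimization

share = stack_slow / total
speedup = total / (total - stack_slow + stack_fast)

print(f"stack share of the load: {share:.4%}")                 # 0.0100%
print(f"speedup from 500µs -> 5µs: {speedup:.6f}x")            # 1.000099x
print(f"speedup from fiber, 5s -> 500ms: {total / 0.5:.0f}x")  # 10x
```

In this scenario, even a 100x regression in the stack (5µs back up to 500µs) moves the total by about one part in ten thousand.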