There has been a lot written about this, but here's a test I just ran on a random page on Huffington Post:
https://www.webpagetest.org/result/181031_TT_443f9d1e666d08f...
The article is probably <200 words, but the page weighs 3.2 MB and makes >200 requests. 55 of those are JavaScript requests.
Right, HuffPo is egregious. How about a site HN readers might frequent, something lightweight like Reddit?
https://www.webpagetest.org/result/181031_H6_eaa2e64c9969515...
Random Reddit page, content: a medium-sized image. 164 requests, ~12 seconds to render the page on the test rig. 60 JavaScript requests, with 1.5 MB of JS downloaded to display the post, a 46 KB image. That's roughly a 30:1 ratio of JS to post content (the thing the user actually wanted to see), and there's other bloat beyond the JS.
I just picked these two sites at random. You can do this all day with random websites. I would guess that most sites behind a .com (or .co.uk, etc.) will look similar.
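If you want to run this comparison yourself without eyeballing waterfalls, WebPageTest lets you export a HAR file for any run, and tallying it takes a few lines. A minimal sketch (the `tally_har` helper and the coarse type buckets are my own, not part of any WebPageTest API):

```python
import json
from collections import Counter

def tally_har(har):
    """Tally request count and transferred bytes per coarse content type."""
    counts, sizes = Counter(), Counter()
    for entry in har["log"]["entries"]:
        mime = entry["response"]["content"].get("mimeType", "")
        kind = ("js" if "javascript" in mime else
                "image" if mime.startswith("image/") else
                "html" if "html" in mime else
                "other")
        counts[kind] += 1
        # bodySize is the transferred (possibly compressed) size; -1 means unknown
        sizes[kind] += max(entry["response"].get("bodySize", 0), 0)
    return counts, sizes
```

Feed it the exported file with `tally_har(json.load(open("page.har")))` and you get the JS-to-content ratio for any page in a couple of seconds.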
I don't mean to pick on these two sites in particular; I just wanted to point out that these practices are widespread and that even reputable developers engage in them, which will make the rot that much harder to undo.