They had put in a ticket with ops saying the server was slow and asking if we could look at it. So we looked. Every single video on a page with a long video list pre-loaded a part of itself. The only reason the site didn't run like shit for them was that the office had direct fiber to our datacenter a few blocks away.
We really shouldn't allow web developers more than 128 kbit/s of connection speed; give them any more and they just make nonsense of it.
Combined with CPU throttling, it's a decent sanity check to see how well your site will perform on more modest setups.
Developers really ought to test such things better.
We have some egregious slowness in our app that only shows up for our largest customers in production but none of our organizations in development have that much data. I created a load testing organization and keep considering adding management to it so they implicitly get the idea that fixing the slowness is important.
I had a fairly large supplier that was very proud of having implemented a feature that deliberately (in their JS) delays reactions to HTTP responses, so that they can showcase all the UI touches like progress bars and spinners. It was an option in the system settings you could turn on globally.
My mind was blown: were they not aware of F12 in any major browser? They were not, it seems. After I quietly asked about it, they removed the whole thing equally quietly and never spoke of it again. It's still in the release notes, though.
That was like 2 years ago, so browsers had been able to do that for 10-14 years (depending on how you count).
Alternatively, run uBlock Origin and NoScript and you probably won't need it.
The single-line change of adding loading=lazy to the <img> elements wouldn’t fix everything, but it would make the page at least basically usable.
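If you want to see what that single attribute buys on an existing page, here's a rough dev-console sketch (the helper name is mine, not from the site's actual code):

```javascript
// Dev-console sketch: opt every <img> that doesn't already declare a
// loading behavior into native lazy loading, so offscreen images are
// only fetched when they approach the viewport.
function lazifyImages(doc) {
  let patched = 0;
  for (const img of doc.querySelectorAll('img')) {
    if (!img.getAttribute('loading')) {
      img.setAttribute('loading', 'lazy');
      patched++;
    }
  }
  return patched; // number of images switched to lazy loading
}

// In a browser console: lazifyImages(document);
```

Doing it properly in the markup (`<img loading="lazy" ...>`) is of course better, since the console version only helps after the page has already started loading everything.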
"Why do you want 64 GB RAM in your laptop?"
"I need that to load the gallery"
Marketing dept. too. They're the primary culprits in all the tracking scripts.
Marketing and managers should be restricted as well, because managers set the priorities.
There are good reasons to have a small cheap development staging server, as the rate-limited connection implicitly trains people what not to include. =3
I'm so happy to have seen their web site that I want to do business with them, even though I have no business to be done.
Bitlbee saved (and still saves) my ass, with tons of protocols available via IRC using nearly nil data to connect. Also, you can connect with any IRC client from the early '90s onward.
Not just web developers. Electron lovers should be throttled with 2 GB RAM machines and some older Celeron/Core Duo box with an OpenGL 2.1-compatible video card. If the desktop 'app' is smooth on that machine, your project is ready.
>I used the text web (https://text.npr.org and the like) thru Lyx
Maybe you mean Lynx or Links instead of Lyx?
I was asked to look at the site when it was already live, and some VP of the parent company decided to visit the site from their phone at home.
[1] https://css-tricks.com/test-your-product-on-a-crappy-laptop/
This breaks my brain..."run around 750MB" "per website" "open once".
Do you mean your company had several websites (as in unique websites), and each had 750MB? Or the transferred data volume of each was 750MB? Or just the first page? Or the source code?
And do you mean "open once", as in "once upon a time", or if you open the website one time? Or did your developers open and shut websites, and the open ones had 750MB? Or one time when you entered your developer's office space, you saw they had opened several websites on their computers, random ones, and by coincidence you saw the network tabs and each was 750MB data transferred?
I wouldn't even guarantee it's developers adding it. I'm sure they have some sort of content management system for doing article and ad layout.
I chewed through 7 GB of data in about 30 minutes while working tethered to my phone.
You're not insightful for noticing a website is dog slow or that there is a ton of data being served (almost none of which is actually the code). Please stop blaming the devs. You're laundering blame. Almost no detail of a web site or app is ever up to the devs alone.
From the perspective of the devs, they expect that the infrastructure can handle what the business wanted. If you have a problem you really should punch up, not down.
If a bridge engineer is asked to build a bridge that would collapse under its own weight, they will refuse. Why should it be different for software engineers?
> From the perspective of the devs, they expect that the infrastructure can handle what the business wanted. If you have a problem you really should punch up, not down.
this belittles the intelligence of the dev team. they should know better. it's like validating someone saying "i really thought i could pour vodka in the fuel tank of this porsche and everything would function correctly. must be porsche's fault."
https://en.wikipedia.org/wiki/Conway's_law
Have a wonderful day =3
Just FYI how this generally works: it's not developers who add it, but non-technical people.
Developers only add a single `<script>` in the page, which loads Google Tag Manager, or similar monstrosity, at the request of someone high up in the company. Initially it loads ~nothing, so it's fine.
Over time, non-technical people slap as many advertising "partner" scripts they can in the config of GTM, straight to prod without telling developers, and without thinking twice about impact on loading times etc. All they track is $ earned on ads.
(It's sneaky because those scripts load async in the background, so the website doesn't immediately feel slower or more bloated. And of course, on a high-end laptop the website feels "fine" compared to on a cheap Android. Also, there's nothing developers can do about those requests; they're under the full control of all those 3rd parties.)
Fun fact: "performance" in the parlance of adtech people means "ad campaign performance", not "website loading speed". ("What do you mean, performance decreased when we added more tracking?")
I didn't win that one, but I did make sure that it would only load after the user agreed to tracking cookies and the like.
And sure, better prioritization and cooperation with eng can make the “real” release processes work better for non-eng stakeholders, but “better” is never going to reach the level of “full autonomy to paste code to deploy via tag manager”.
This is the same reason why many big apps have a ton of WordPress-managed pages throughout the product (not just marketing pages); often, that's because the ownership and release process for the WP components is "edit a web UI" rather than "use git and run tests and have a test plan and schedule a PR into a release".
I recently had a case at work: while filling out a contact form to add a new party, there were 300+ calls to the validation service to validate emails and phone numbers. Three calls for every character entered into every text input!
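The textbook fix for that kind of per-keystroke storm is debouncing: wait for the user to pause before firing the request. A minimal sketch (the `validateField` name is a hypothetical stand-in for the real validation-service call):

```javascript
// Minimal debounce: collapse a burst of calls into one call that fires
// only after `delayMs` of silence.
function debounce(fn, delayMs) {
  let timer = null;
  return (...args) => {
    clearTimeout(timer);                       // cancel the pending call
    timer = setTimeout(() => fn(...args), delayMs); // reschedule
  };
}

let requests = 0;
const validateField = debounce(() => { requests++; }, 50);

// Simulate typing 10 characters in quick succession:
for (let i = 0; i < 10; i++) validateField();
// Once the "typing" stops, exactly one validation request fires,
// instead of ten (or thirty, at three calls per character).
```

Validating on blur, or only on submit, is often even simpler and cheaper than debouncing every keystroke.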
GDPR compliance is the first thing that goes out the window when in the EU, and with it conformance to the law. Ethics fly out the window at the same time, or just slightly afterwards: when they add tracking that no one agreed to, when they forget to ask for consent, when they have a "consent" popup that employs dark patterns, or when they outsource consent to a third-party tool that informed visitors want nothing to do with.
The discussions here about DNS-level blocking and Pi-hole are spot on. It's interesting that the burden of a clean reading experience is slowly being offloaded to the user's network stack.
No worries if not :)
Surely news outlets like the NYT must realize that savvy web surfers like yours truly, when encountering "difficult" news sites (those behind paywalls and/or with megabytes of JavaScript bloat), will just go elsewhere or load the pages without JavaScript.
We'll simply cut the headline from the offending website, paste it into a search engine, and find another site with the same or similar info but with easier access.
I no longer think about it, as by now my actions are automatic. Rarely do I find an important story that's limited to only one website; generally dozens have the story, and because of syndication the alternative site one selects often even has identical text and images.
My default browsing is with JavaScript defaulted to "off" and it's rare that I have to enable it (which I can do with just one click).
I never see ads on my Android phone or PC, and that includes YouTube. Disabling JavaScript on webpages nukes just about all ads; they just vanish, and any that slip through are then trapped by other means. In short, ads are optional. (YouTube doesn't work sans JS, so just use NewPipe or PipePipe to bypass ads.)
Disabling JavaScript also makes pages blindingly fast as all that unnecessary crap isn't loaded. Also, sans JS it's much harder for websites to violate one's privacy and sell one's data.
Do I feel guilty about skimming off info in this manner? No, not the slightest bit. If these sites played fair then it'd be a different matter but they don't. As they act like sleazebags they deserve to be treated as such.
In the past some sites had light versions, but I haven't come across one in over 10 years.
Makes me wonder if this isn't just some rogue employee maintaining it without anyone else realizing.
It’s the light version, but ironically I would happily pay these ad networks a monthly $20 to just serve these lite pages and not track me. They don’t make anywhere close to that from me in a year
Sadly, here's how it would go: they'd do it, it'd be successful, they'd IPO, after a few years they'd need growth, they'd introduce a new tier with ads, and eventually you'd somehow wind up watching ads again.
If Al Jazeera or BBC had a similar text only site, that would be best. I really love the different perspectives.
I mostly use brutalist.report to find the articles, then deal with them on a case by case basis.
They also compress the hell out of the images, so it all loads shockingly well on poor connections.
They know this. They also know that web surfers like you would never actually buy a subscription and you have an ad blocker running to deny any revenue generation opportunities.
Visitors like you are a tiny minority who were never going to contribute revenue anyway. You’re doing them a very tiny favor by staying away instead of incrementally increasing their hosting bills.
I subscribe, and yet they still bombard me with ads. Fuck that. One reason I don’t use apps is that I can’t block ads.
It's closer to 30% that block ads. For subscription conversion, it's under 1%.
It's a large reason why the situation is so bad. But the internet is full of children, even grown children now in their 40's, who desperately still cling to this teenage idea that ad blocking will save the internet.
For a while it looked like companies were going to offer a good product at a fair price. I started getting a few subscriptions to various services.
Then all of those services got enshittified. I got ads on paid accounts, slow loads, obvious data mining, etc.
Paying for services now often offers a degraded experience relative to less legitimate methods of access.
This isn't a simple as it sounds, in fact it's rather complicated (far too involved to cover in depth here).
In short, ethics are involved (and believe it or not I actually possess some)!
In the heyday of newsprint, people actually bought newspapers at a cheap, affordable price, and the bulk of their production was paid for by advertisements. We readers mostly paid for what we read, newspapers were profitable, and much journalism was of fair to good quality. Back then, I had no qualms about forking out a few cents for a copy of the NYT.
Come the internet the paradigm changed and we all know what happened next. In fact, I feel sorry about the demise of newsprint because what's replaced it is of significantly lesser value.
In principle I've no objection to paying for news but I will not do so for junk and ads that I cannot avoid (with magazines and newspapers ads are far less intrusive).
So what's the solution? It's difficult but I reckon there are a few worth considering. For example, I mentioned some while ago on HN that making micro payments to websites ought to be MUCH easier than it is now (this would apply to all websites and would also be a huge boon for open source developers).
What I had in mind was an anonymous "credit" card system with no strings attached. Go to your local supermarket, kiosk or whatever and purchase a scratchy card with a unique number to say the value of $50 for cash and use that card to make very small payments to websites. Just enter the card's number and the transaction is done (only enter one's details if purchasing something that has to be delivered).
That way both the card and user remain anonymous if the user wishes, also one's privacy is preserved, etc. It could be implemented by blockchain or such.
The technical issues are simple but the problems are obvious, and they're all political. Governments would go berserk and cry money laundering, tax evasion, criminal activity, etc., and middlemen such as Mastercard and Visa would scream to high heaven that their monopolies were being undercut.
In short, my proposal essentially parallels what now exists with cash: I go to a supermarket and pay cash for groceries, and the store doesn't need to know who I am. It ought to be no big deal, but it isn't.
It seems to me a very simple micro payments system without name, rank and serial number attached would solve many of the internet payment problems.
Sure, there'll always be hardline scavengers and scrapers but many people would be only too happy to pay a little amount for a service they wanted, especially so when they knew the money was going into producing better products.
For example, I'd dearly love to be able to, say, purchase a copy of LibreOffice for $10-$20 and know there was enough money in the organisation to develop the product to be fully on par with MSO.
Trouble is when buying stuff on the internet there's a minimum barrier to overcome and it's too high for most people when it comes to making micro payments (especially when the numbers could run into the hundreds per week).
I cannot understand why those who'd benefit from such a scheme haven't at least attempted to push the matter.
Oh, and that's just one aspect of the problem.
That's not true; I had a subscription for multiple years. I canceled it because they:
A. kept trying to show me bullshit ads, and B. let the overall quality of the content deteriorate, especially the opinion section.
Websites that load a big JS bundle, then use that to fetch the actual page content don't get archived properly by The Wayback Machine. That might not be a problem for corporate content, but lots of interesting content has already been lost to time because of this.
Seems like a gross overestimation of how much facility people have with computers but they don't want random article readers anyway; they want subscribers who use the app or whatever.
No.
"savvy" web surfers are a rounding error in global audience terms. Vast majorities of web users, whether paying subscribers to a site like NYT or not, have no idea what a megabyte is, nor what javascript is, nor why they might want to care about either. The only consideration is whether the site has content they want to consume and whether or not it loads. It's true that a double digit % are using ad blockers, but they aren't doing this out of deep concerns about Javascript complexity.
Do what you have to do, but no one at the NYT is losing any sleep over people like us.
Likely not, but they are over their lost revenues. The profitability of newspapers and magazines has been slashed to ribbons over the past couple of decades and internet revenues hardly nudge the graphs.
Internet beneficiaries are all new players, Google et al.
Where do you trust to read the news? Any newsrooms well staffed enough to verify stories (and not just reprint hearsay) seem to have the same issues.
They are also not averse to using legal means to block them. For example, back when Microsoft shipped Windows Phone, Google refused to make an official YouTube client for it, so Microsoft hacked together its own. Google forced them to remove it from the store: https://www.windowscentral.com/google-microsoft-remove-youtu...
Only several days ago I watched the presenter of RobWords whinging about wanting more subscribers, noting that many more people just watch his presentations than watch and also subscribe.
The other problem YouTube has is that, unlike Netflix et al. with their high-ranking commercial content, it hosts millions of small presenters who don't use advertising and/or just want to tell the world at large their particular stories. Enforced DRM would ruin that ecosystem altogether.
News sites aren’t publishing their content for the warm fuzzy feeling of seeing their visitor count go up. They’re running businesses. If you’re dead set on not paying and not seeing ads, it’s actually better for them that you don’t visit the site at all.
Another quick point: my observation is that the worse the ad problem the lower quality the content is. Cory Doctorow's "enshitification" encapsulates the problems in a nutshell.
The fundamental problem of journalism is that the economics no longer works out. Historically, the price of a copy of a newspaper barely covered the cost of printing; the rest of the cost was covered by advertising. And there was an awful lot of advertising: everything was advertised in newspapers. Facebook Marketplace and Craigslist were a section of the newspaper, as was whichever website you check for used cars or real estate listings. Journalism had to be subsidised by advertising, because most people aren't actually that interested in the news to pay the full cost of quality reporting; nowadays, the only newspapers that are thriving are those that aggressively target those who have an immediate financial interest in knowing what's going on: the Financial Times, Bloomberg, and so on.
The fact is that for most people, the news was interesting because it was new every day. Now that there is a more compelling flood of entertainment in television and the internet, news reporting is becoming a niche product.
The lengths that news websites are going to to extract data from their readers to sell to data brokers is just a last-ditch attempt to remain profitable.
Yes it does. From the NYT's actual 2025 earnings release:
1. The Company added approximately 450,000 net digital-only subscribers compared with the end of the third quarter of 2025, bringing the total number of subscribers to 12.78 million.
2. Total digital-only average revenue per user (“ARPU”) increased 0.7 percent year-over-year to $9.72
2025 subscription revenue was $1.950 billion. Advertising was $565 million, which includes $155 million of print advertising.
Sure operating profit is only 550 million very close to the advertising revenue, but the bulk of their income is subscriptions, they could make it work if they had to. My suspicion is that if they dropped all the google ads they could have better subscription retention and conversion rates as well.
But at least in terms of the headline metric of bandwidth, it's somewhat less horrifying. With my ad blocker off, Firefox showed 44.47 MB transferred. Of that, 36.30 MB was MP4 video. These videos were journalistic in nature (they were not ads).
So, yes in general, this is like the Hindenburg of web pages. But I still think it's worth noting that 80% of that headline bandwidth is videos, which is just part of the site's content. One could argue that it is too video heavy, but that's an editorial issue, not an engineering issue.
It's a news site with a lot of auto-playing video. If you like that kind of content, great. If not, there are lots of other websites with different mixes of content. I subscribe to The Economist, which has few videos, and they never auto-play.
But that's a question of taste. 5 MB of JavaScript and hundreds of tracking assets is not.
It would have been so much better if we had simply decided back in the ’90s that executable programs and HTML don’t belong together. The world would be so much better today.
Would've been cool if we could know if site X served the same JS as before. Like a system (maybe even decentralized) where people could upload hashes of the JS files for a site. Someone could even review them and post their opinions. But mainly you'll know you're getting the same JS as before - that the site hasn't been hacked or that you're not being targeted personally. If a file needs to update, the site could say in the changelog something like "updated the JS file used for collapsing comments to fix a bug". This could be pushed by the users to the system.
Especially important for banking sites and webmail.
I didn’t bother validating this, but I’m sure they wouldn’t lie or misinterpret!!
JavaScript and WebAssembly programs are always executed in a sandboxed VM, without read access to the host OS files (unless, of course, you grant it).
Enabling scripting was a necessary step for interactive websites. Without it, a full page load would be required every time you upvote a Hacker News comment. In my opinion, the real problem is that browsers allow too many connections to third-party domains, which are mostly ads and trackers. Those should require user-approved permissions instead of being the default.
Modern CSS (and some newer HTML features) also reduces the need for scripting.
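The upvote example above is worth making concrete: with scripting, a vote is one small background request instead of re-fetching and re-rendering the whole page. A sketch (`/vote` is a hypothetical endpoint; `fetchFn` is injectable only so the sketch can be exercised outside a browser):

```javascript
// One small POST instead of a full page reload. The page's DOM stays put;
// only the vote arrow needs updating afterwards.
async function upvote(commentId, fetchFn = globalThis.fetch) {
  const res = await fetchFn(
    `/vote?id=${encodeURIComponent(commentId)}&how=up`,
    { method: 'POST' }
  );
  return res.ok; // true if the server accepted the vote
}
```

A few hundred bytes over the wire versus re-downloading the entire thread; that's the legitimate case for scripting, quite apart from the tracker payloads.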
I very much doubt that "Enabling scripting was a necessary step for interactive websites." (emphasis added). It may well have been the most convenient and fastest way to get the functionality to the most users. With Javascript each website could provide functionality without waiting for such to be implemented by all browsers.
However distribution of power also leads to more complex trust relationships (even if one is confident that sandboxing is effective). Independent implementation also leads to more complexity overall.
The 49MB webpage just shows what our priorities are. It shows the target audience has fast internet that can load this without issues. On my average home connection in Australia, I can download a 49MB page in 0.3 seconds. We spend time optimising for what matters to the end user.
While this article focuses on ads, it's worth noting that sites have had ads for a long time, but it's their obnoxiousness and resource usage that's increased wildly over time. I wouldn't mind small sponsored links and (non-animated!) banners, but the moment I enable JS to read an article and it results in a flurry of shit flying all over the page and trying to get my attention, I leave promptly.
Are the few cents you get from antagonizing users really worth it?
I suspect the answer is simple and that most users don’t give a shit
I think it has to do a lot with when you came of age - I’m in my late 30s, I got my first tech job at 14 as a sys admin for a large school district, and every single developer, admin, etc that I knew was already going on about the free internet. As a result, I’ve never had a tolerance for anything but the most reasonable advertisements
I think that ideology is necessary to care enough and be motivated enough to really get rid of ads, how fucking awful the websites are alone should be enough but for most people it isn’t
A comparison of CPU usage for idling popular webpages: https://ericra.com/writing/site_cpu.html
Regarding tracker domains on the New Yorker site: https://ericra.com/writing/tracker_new_yorker.html
I've tried all of the auto-discard-type extensions, but yeah, the background CPU usage is crazy.
How can we go back to a Web where websites are designed to be used by the user and not for the shareholders?
You can't beat China Southern. They have the most dog-shit website I've ever seen. The flight was fine, but I gave up on online check-in after 3 attempts. Never mind the bloat:
- required text fields with wrong or missing labels. One field was labeled "ticket no.". It kept getting rejected. I randomly tried passport number instead. It worked.
- sometimes fields only have a placeholder that you can't fully read because the field has not enough width ("Please enter the correct...") and the placeholder disappears once you start typing.
- date picker is randomly in Chinese
- makes you go through multi step seat selection process only to tell you at the end that seat selection is not possible anymore.
- signed up with email; logged out and went back to the SAME login page; now sign up via phone number is required!?
Loudly oppose the trendchasing devs who have been brainwashed into the "newer is better" mindset by Big Tech. I'm sure the shareholders would want to reduce the amount they spend on server/bandwidth costs and doing "development and maintenance" too.
Simple HTML forms can already make for a very usable and cheap site, yet a whole generation of developers have been fed propaganda about how they need to use JS for everything.
Or for developers to pad their CV.
Google Reader was never the answer. It's such a shame that people even here don't realize that relying on Google for that had interests at odds - and you weren't part of the equation at all.
Well, except for your data. You didn't give them enough data. So they shut down shop. Gmail though, ammirite? :D
Yeah I wonder why gmail was not one of the shut down products /s
4 MB was an absurd size for a website in 2008. It's still an absurd size for a website.
You have 20 ads scattered around, an autoplaying video of some random recipe/ad, 2-3 popups to subscribe, buy some affiliated product and then the author's life story and then a story ABOUT the recipe before I am able to see the detailed recipe in the proper format.
It's second nature to open all these websites in reader mode for me atp.
Running this in the dev console makes it snappy again:

    let allElements = document.querySelectorAll('*');
    allElements.forEach(element => {
      element.style.filter = 'none';
      element.style.backdropFilter = 'none';
    });

I should really run some tests to figure out how much lighter the load on my link is thanks to the filter.
I also manually added some additional domains (mostly fonts by Google and Adobe) to further reduce load and improve privacy.
I haven't done any rigorous tests, but my experience has been that it can be lower than a tenth.
Nice term!
> Or better yet, inject the newsletter signup as a styled, non-intrusive div between paragraphs 4 and 5. If the user has scrolled that far, they are engaged.
They're engaged with the content! There is no way to make some irrelevant signup "non-intrusive". It's similar to links to unrelated articles - do you want users to actually read the article or jump around reading headlines?
There have been countless businesses that have lost out on my money because I clicked on their ad (good job!), start reading the product info on their website as it loads (that's basically a sale), and then the page finished loading with a barrage of popups for cookie consent, newsletter signup for a discount code, special offers, special sale, spin the wheel for a prize, etc. That's when I close the tab and forget about it.
Luckily I use a proper content blocker (uBlock Origin in hard mode).
So much hostile user design.
Edit: NPR gets a little shout out for being able to close their annoying pop-ups by clicking anywhere that's not the notification. So it's still crappy that it hijacks the screen, but not awful I guess?
If people tune out only because how horrible the sites are, good.
Someone is serving this amount of data to every visitor. Even if you want to track the user as much as possible, wouldn't it make sense to figure out how to do that with the least amount of data transfer possible as that would dramatically reduce your operating cost?
Perhaps size optimization is the next frontier for these trackers.
they won't be able to complain about low memory but their experience will be terrible every time they try to shove something horrible into the codebase
Apps don't have adblockers.
And most people don't even use an adblocker when browsing normal sites. I kind of had to teach the people around me to use one.
For every 1 engineer it seems like there are 5 PMs who need to improve KPIs somehow and thus decide auto playing video will improve metrics. It does. It also makes people hate using your website.
I would constantly try to push back against the bullshit they'd put on the page but no one really cares what a random engineer thinks.
I don't think there's any real way to solve this unless we either get less intrusive ad tech or news gets a better business model. Many sites don't even try with new business models, like local classifieds or local job boards. And good luck getting PMs to listen to an engineer talking about these things.
For now, the bloat remains.
The answer is really simple and follows on from this article; the purpose of the app is even more privacy violation and tracking.
And it works without JavaScript... but there does appear to be some tracking stuff. A deferred call out to Cloudflare, a hit counter I think? and some inline stuff at the bottom that defers some local CDN thing the old-fashioned way. Noscript catches all of this and I didn't feel like allowing it in order to weigh it.
So they could do exactly what they are doing on the web and may be even more but with Native code so it feels much faster.
I got to the point of wondering why all the tracking companies and ad networks can't just share and use the same library.
But on web page bloat: let's not forget apps are insanely large as well. 300-700 MB for a banking, travel, or other shopping app. Even if you cut 100 MB of L10n, they're still large just because of, again, tracking and other things.
Vote with your behavior. Stop going to these sites!
Want to bet 100 MB? 1 GB? Is it unthinkable?
20 years ago, a 49 MB home page was unthinkable.
I think there'll continue to be growth in page sizes, but then maybe we'll consider efficiency, or the NYTimes shuts down and the 20MB page will be the liquidators selling the domain. Maybe we don't even use domains by then as everything is on an app.
> Today, there's ~30 times more js than html on homepages of websites (from a list of websites from 5 years ago).
It seems that this number only goes up.
Maybe you'd need one chart for request counts (to make tracking stand out more) and another for amount of transferred data.
This site was created to give developers and pms some ammunition to work on improving load speed
The same phenomenon worsened during the DotCom Meltdown and the Great Financial Crisis. This accelerated desperation is a sign of the times; paying subscribers are likely cancelling due to current economic conditions.
Of course not. Its all about maximising shareholder value. The users aren't a consideration anymore
with almost all options and filters enabled, ofc
I am considering moving to technitium though, it seems better featured.
example 'sponsored listings' on amazon.
plenty of ad-networks that provide that 'native' experience
To make it somehow worse... when you scroll down, you'd think it would leave you alone as it leaves the viewport. No. It detaches, shrinks, and pins itself to the bottom right of your screen and continues playing. It keeps the distraction going and, as if teasing you, features a microscopic 'X' button with a tiny hit area (violating Fitts's law).
Is there no way to stop this? The "do not autoplay videos" option often does not work.
||primis.tech$domain=~primis.tech
Primis is one of them, but there are a few of these companies. I can't remember them all.

It's as if everyone designed their website around the KPI of irritating your visitors and getting them to leave ASAP.
The thing is, though... they _don't_ have to. It's been my standard practice for years to just tap ctrl-w the moment any web page pops up a modal box. Some leeway is given to cookie dialogs _if_ they have a disagree/disable button _prominently_ visible; otherwise they're ctrl-w'd too.
"Newsletter..." ctrl-w.
"Please disable your..." ctrl-w.
"Subscribe to read..." ctrl-w.
Ctrl-w is your friend.
They also have their own tiktok and AI slop that I never knew about.
I’m afraid someone who wants to support professional journalism and agrees to pay ~$300/yr for an NYT subscription still gets most (all?) of this nonsense?
We put up with the nonsense because we had to. But the irony is content creators are actively driving their audience away.
I see three options:
1. Show me reasonable ads and I will disable ad blocking
2. Do the crap described in this article and don't complain when I arm myself against it
3. Do a hard paywall and no ads; force me to pay to see your content
If you need a CDN and half a browser runtime just to show 800 words about celebrity nonsense, the business model is broken. Everyone else is footing the bandwidth bill for nonsense they never asked to receive.
I mean, they can absolutely try. That doesn't mean they should succeed.