I get what your advice is about, but to add some nuance it didn't cover: consider that I learned of Alec's Technology Connections channel 9 years ago because the YouTube algorithm suggested it to me.
Why did YouTube do that? Because I had watched Ben's excellent Applied Science video showing vinyl grooves under an electron microscope: https://www.youtube.com/watch?v=GuCdsyCWmt8
So the first Alec video I was exposed to was his related one on vinyl records (click "Oldest" to see them): https://www.youtube.com/@TechnologyConnections/videos
I'd argue that the YouTube algorithm is very good at finding adjacent videos of interest, especially for educational topics and DIY repair tutorials.
You're suggesting people go to their YouTube subscription feeds, but people often have a list of favorites in their subscriptions because of the algorithm. There's a bit of a chicken-vs-egg situation there.
What a good algorithm does is help users with the Explore-vs-Exploit tradeoff: https://en.wikipedia.org/wiki/Exploration%E2%80%93exploitati...
- Explore --> the YouTube algorithm's sidebar recommendations of related videos.
- Exploit --> add a worthy creator to your subscription feed and get alerted to new releases from that person.
The "explore" part is helped by algorithms because they can suggest videos you would never have thought of, because you don't know the keywords or jargon to type into a YouTube search box to get to them directly. "You don't know what you don't know."
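The explore-vs-exploit tradeoff above is the classic multi-armed bandit problem. A minimal epsilon-greedy sketch illustrates the idea; the channel names and reward numbers here are invented for illustration, and a real recommender is vastly more complicated than this:

```python
import random

def pick_channel(avg_reward, epsilon=0.1):
    """Mostly exploit the best-known channel, occasionally explore a random one."""
    if random.random() < epsilon:
        return random.choice(list(avg_reward))   # explore: anything goes
    return max(avg_reward, key=avg_reward.get)   # exploit: best so far

# Hypothetical per-channel "reward" estimates (e.g. fraction watched):
avg_reward = {
    "Technology Connections": 0.9,  # known favorite
    "Applied Science": 0.8,
    "Some New Channel": 0.0,        # never watched yet
}

picks = [pick_channel(avg_reward) for _ in range(1000)]
```

With epsilon = 0.1, roughly 90% of picks exploit the current favorite, while the remaining 10% explore at random, which is the only way "Some New Channel" ever gets a chance to prove itself.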
But don't use the algorithm for politics or click on anything that has a thumbnail with the shocked Pikachu face. That just starts a feedback loop of crap.
Arguably, the algorithms could put one into a non-productive engagement loop with no escape. Personally, I don't think that's a big risk for educational/DIY topics, because your brain gets saturated with "too much information" and hits a stopping point where you don't want to learn any more.
So... Algorithms can be bad ... but you can also make them work for you.
I never go to my subscription feed; the front-page algo keeps me up to date on any new content from people I want to see updates on. I've also noticed it almost has a "shadow subscription": even though I'm not subscribed to certain channels, it knows I watch every video by them, so they get on my front page too.
The front page really has a "vibe" that follows my interests around. Watch a few too many Minecraft or car-repair videos and soon more and more of the front page is those topics. Get a new interest in pyramids? Devlogs? Nature? The front page slowly decays old interests and promotes new ones.
Which is again why I don’t check my sub feed - it’s a graveyard of interests, many of which I don’t care about right now. The algo surfaces the ones I do.
In my experience it's "watch one video outside of your recommendations and then half your next set of recommendations will be related to that". I'm scared to click on anything I'm not already subscribed to for fear of trashing the home page.
I feel like clicking a video and immediately clicking off is also a negative signal they use but YMMV.
I get recommended right-leaning videos and videos with ads for Manscaped, and I'm neither conservative nor a man. It's super weird, so I tend to separate my interests into two apps: the YouTube web app for "junk food content" and FreeTube when I want to learn and focus. It's the only way I've found to not be fed the random content carrots while falling down the rabbit hole.
Right now my homepage seems to be
- construction/DIY videos (Perkins, B1M, Megaprojects, Matt Risinger, NS Builders)
- video game dev (blackthornprod)
- "indie game of the day" channels (Aliensrock, Nialus)
- military videos (Battleship New Jersey, Ryan McBeth)
- freerunning / urban exploration (STORRER)
- movie & tv analysis / commentary (Frame Voyager, Corridor Crew, New Rockstars)
- chess (agadmator, Magnus)
- Minecraft (Mumbo Jumbo)
- random documentaries (fern, Stewart Hicks, Half as Interesting)
- egypt / pyramids (History for GRANITE)
- science / engineering (Adam Savage, Colin Furze, Applied Engineering)
- coding (Tsoding)
From just a quick scan of the topics / channels.
YouTube wants my money. They will never get it when they come up with things like that. I'll give them my money once they start cracking down on ads, and by that I mean actually moderating them, not serving random ads with porn. As long as they run scam ads I will never pay them, and it doesn't look like that will change in my lifetime.
I'd rather use a lens more like all the open-source/free-software concerns about controlling your own computer:
1. Can I see how the recommendation algorithm is intended to work? The site-owner says it works for my benefit, but what if they're mistaken, or lying?
2. What has it recorded about my interests, and how can I fix bad records that don't represent them?
3. When it's not working well, or is harmfully exploiting my baser weaknesses, how can I switch to a different one?
"Whose problem is it that it solves?"
It's possible to get some benefit, just as a side effect, from an algorithm/process that was never designed to work in your interest and is an opaque cloud service. Maybe the service is solving the network owner's problem of selling you to advertisers. If you want to maximise for "interest and relevance to my life goals", there's nothing to stop you running your own "algorithm", of course, except any obstacles put in your way by the data network owner. For that reason it's more important to pay attention to the freedom of the network (open API, federated, maximally distributed, etc.) than to the algorithms that run on it. If you control the former, you control the latter. HN (the network) seems to allow a lot, judging by the plethora of third-party viewers I've seen.
I also read "Technopoly" recently, and while it didn't have quite the same impact on me, I can't deny that it accurately describes the techno-political moment we're currently living in. Well worth the time.
It probably helps that I only permit a handful of specific topics: physics, fun math, synthesizers (but not modular), tiny bit of music theory/training, StarCraft 2 (not SC1/BW), and recently the Nvidia/AMD GPU release saga.
This may also be an artifact of the fact that you are the sort of person who seeks out educational content. I.e. you have a high need for intellectual stimulation. That makes you an outlier among all people who use social media.
Personally, I think technical people underestimate the negative impacts of the models that drive the algorithms. We are basically training humans via a reward function that maximizes watch time. We are also heavily correlating errors in knowledge, because popular stuff gets boosted so much. Correlated errors are bad for robustness.
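The "reward function that maximizes watch time" point can be made concrete with a toy sketch. The video names and numbers are invented; the point is that whatever holds attention longest gets scored, and therefore recommended, highest, regardless of accuracy or usefulness:

```python
# Toy recommender whose only reward signal is observed watch time.
def update_score(scores, video, watch_seconds, lr=0.1):
    """Move a video's score toward the watch time just observed."""
    scores[video] += lr * (watch_seconds - scores[video])

scores = {"careful explainer": 50.0, "outrage clip": 50.0}

# Suppose the outrage clip reliably holds attention longer:
for _ in range(100):
    update_score(scores, "careful explainer", 40)
    update_score(scores, "outrage clip", 70)

best = max(scores, key=scores.get)  # this is what the feed promotes
```

Nothing in that loop knows or cares which video is accurate; the scores simply converge toward the observed watch times, and the attention-grabbing video wins.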
In my personal experience, "edutainment" can certainly be addictive, and more often than not, consuming it is "unproductive" because (1) consuming content aimlessly is intrinsically mostly passive, (2) passive consumption is ineffective for retaining knowledge or building understanding, and (3) content is often only superficially interesting because of a spectacular and/or highly simplified presentation.
This is only a counterpoint to the idea that educational content is limited in its potential to be addictive/unproductive; there is still, obviously, a great positive potential to high-quality educational content.
Algorithmic feeds don't give us that opportunity - they're designed to require minimal effort and to keep the dopamine coming without any conscious decisions.
I have no complaints about my Instagram and YouTube feeds. They give good recommendations.
TikTok in particular sneaks politics into everything. Even if it's not explicitly political.
I asked Deepseek once to walk me through what it knows about TikTok and it claimed the Chinese version uses an RL approach to sprinkle socialist core values into your feed even if you explicitly don't want politics. It also claimed TikTok absolutely promises it doesn't do this in the US. I'm not really convinced Deepseek knows what it's talking about but it was pretty plausible technically.
But in practice it's easy to tell when someone, even in the US, spends a lot of time on TikTok, based on their strongly held opinions, even when they explicitly say they never watch political content.
I doubt other social media companies do this because they aren't created specifically for political propaganda like TikTok is, but it's possible they do.
We all know that gambling addicts exist and how destructive gambling is to their lives; the casino exploits their behavior and takes all their money. As a result, people know casinos are dangerous, reasonable people avoid them or are warned about them, and the government enforces regulation to reduce casinos' ability to exploit vulnerable people.
Imagine if none of these controls existed and nobody talked about or generally knew that casinos were dangerous. Imagine if the casinos were 100x better at exploiting you and you were forced to walk through a casino every time you leave your house. You’d get a lot more people having their lives destroyed.
So what this video tries to do is important: naming the term "algorithmic complacency" allows it to be recognized, discussed, and actively kept in check by users, and ideally regulated by the government as well, just like casinos.
The casino also provides a service, entertainment; there's nothing wrong with a reasonable person attending, spending some money, and being entertained. But we as a society recognize that a company exploiting behavior to take all of a person's money is bad, and we try to limit that negative outcome even though we still allow casinos to exist.
Time, attention, and focus are so abstract that people don't even realize they're spending them, or how much their behavior has been modified by the algorithm's exploitation. As a result, we let companies that are 100x better at manipulation than casinos operate without so much as a mention that they're doing it, stealing increasing amounts of a user's time.
Google, Facebook, and the other algorithm-driven tech companies have been aggressively enshittifying their products at least since 2020. "I got fun/useful videos out of the YouTube algorithm in 2016" says nothing about what that algorithm is like in 2025, given that they can change it silently on a whim.
Something that will filter out the anger but keep the insight. I vaguely remember someone posting about a TL;DR tool for Twitter. Does anyone know of tools like that?