https://www.penny-arcade.com/comic/2023/12/01/algo-rhythms
> Gurb turned me on to a kick-ass book called "The Mysterious Case Of Rudolph Diesel," and I think you should read it if you're interested at all in the world, but you should buy it with cash in a town you don't live in and read it in a dimly lit cavern. Because if you don't, if The System finds out you read a book about a fascinating historical character and his mysterious disappearance, you'll be clocked immediately by their tendrils as… whoever this is.
"Targeting" isn't the right word; "statistics" is. These are statistics-based, data-driven mechanisms that do their best to find things with the best chance of you interacting with them positively, where "positive" means money flowing toward the people behind the algorithm.
It might be "statistics-based, data-driven mechanisms" that cause a fisherman to bait a hook, but he's still out there targeting fish and waiting to see which ones bite.
For the record, I don't buy the argument that YouTube's algorithm can radicalize most people. For example, most people who aren't Nazis could listen to Nazis talk all day long for years and never start hating other people because of their race. People who study or track hate groups do exactly that without issue. No amount of listening to Nazis will suddenly turn them into one.
The problem is that there exists a small number of people who are genuinely vulnerable to that stuff, and other people who can be influenced by some extreme ideologies after being desensitized by floods of even more excessively extreme ideas. When you're exposed to an endless stream of totally insane views, the mildly crazy ones can seem saner than they would have otherwise.
I don't think that YouTube should censor content out of fear that some people will be vulnerable or desensitized to extremism, but it would be nice if YouTube didn't explicitly push it on people who never asked for it, in the hopes that some of them will get suckered, just because that drives up engagement.
You haven't seen extremism from those media sources because, relative to your perspective, they aren't extreme.
It seems like you could define some notion of "depth" into a topic (based on how far an item sits outside a normal viewer's patterns), and only generate recommendations for items that aren't far outside the norm, but this would leave niche interests with shallow recommendations.
Maybe a middle ground would be to treat sensitive topics differently for "vertical" recommendations: explicitly mark some categories as safe and allow recommendations to go deeper within them, but only allow "horizontal" recommendations for unknown topics, and maybe prevent recommendations "into" such a topic from the outside.
So... if you're watching train videos you might be shown even more niche ones, but welding videos won't recommend Fox News to you, and watching Fox News won't surface Alex Jones recommendations.
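The vertical/horizontal split above can be sketched in a few lines. Everything here is invented for illustration (the `SAFE_TOPICS` set, the `catalog` shape, the integer "depth" score); a real recommender would be nothing this simple, but the gating logic is the point:

```python
# Hypothetical sketch of "vertical vs. horizontal" recommendation gating.
# depth = how far outside a typical viewer's patterns a video sits (0 = mainstream).

SAFE_TOPICS = {"trains", "welding", "cooking"}  # categories explicitly vetted as safe

def recommend(current_topic: str, current_depth: int, catalog: dict) -> list:
    """Return candidate titles, given the viewer's current topic and depth.

    catalog maps topic -> list of (title, depth) pairs.
    """
    results = []
    for topic, videos in catalog.items():
        for title, depth in videos:
            if topic == current_topic:
                # "Vertical": go deeper only inside explicitly safe topics;
                # otherwise never exceed the viewer's current depth.
                if topic in SAFE_TOPICS or depth <= current_depth:
                    results.append(title)
            else:
                # "Horizontal": only surface-level items from other topics,
                # so we never recommend *into* an unvetted topic's deep end.
                if depth == 0:
                    results.append(title)
    return results

catalog = {
    "trains": [("model railway tour", 3)],
    "politics": [("cable news clip", 0), ("fringe commentary", 4)],
}
print(recommend("trains", 1, catalog))
```

A train enthusiast gets the deep train video (safe topic) and the surface-level news clip, but the deep political content is never offered from outside its own topic.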
I pick on the right here since it's on topic (and I'm left-leaning myself), but I think radicalization is an issue on the left as well, though frankly my political opinions make me think it is less impactful there, mostly because radicalization on the left, I believe, tends to affect less marginalized people, or to be about policy rather than targeting people who are already beaten down.
That’s what left-wing extremism is.
I'll pick on Ivermectin during COVID as an interesting case. Now, obviously, if you have two groups and one has parasites while the other doesn't, the parasite-free group will get better COVID results. So, as expected, people treated with Ivermectin got better COVID outcomes.
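A toy simulation makes the confounding effect concrete. The numbers below (parasite rate, baseline risk, risk multiplier) are completely made up for illustration; the only thing the model assumes is that the drug clears parasites and has zero antiviral effect, yet the treated arm still comes out ahead:

```python
import random

random.seed(0)

def simulate(n=10_000, parasite_rate=0.3):
    """Toy confounder model: ivermectin clears parasites, and parasites
    (not the virus response) drive worse outcomes. All rates are invented."""
    bad_outcomes = {"treated": 0, "untreated": 0}
    for _ in range(n):
        for arm in ("treated", "untreated"):
            has_parasites = random.random() < parasite_rate
            if arm == "treated":
                has_parasites = False  # the drug's only effect in this model
            # baseline 10% bad-outcome risk, tripled by parasite burden
            risk = 0.30 if has_parasites else 0.10
            if random.random() < risk:
                bad_outcomes[arm] += 1
    return bad_outcomes

counts = simulate()
print(counts)  # treated arm fares measurably better despite no antiviral effect
```

With these made-up rates the untreated arm's expected bad-outcome rate is 0.3 × 0.30 + 0.7 × 0.10 = 16% versus 10% treated, so a naive comparison shows a large, statistically significant "treatment effect" that is entirely the parasite confounder.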
It took a long time to get the message out explaining that effect, because in the spheres I listened to, everyone who pointed out the statistically significant result got shut down with logical fallacies; "conspiracy theorist" was definitely one of them.
I'd rather be completely correct, but I'm happy to fall for the occasional conspiracy that is backed by statistically significant evidence. People who make that sort of mistake are going to get better results long term than people who ignore evidence. But this study would classify that sort of evidence-based reasoning as a right-winger being led into extremist conspiracy content. I mean, I dunno. A branch of the right wing believes in looking at primary evidence. That means they get things wrong, and sometimes right, in ways out of sync with the mainstream conversation.
As long as there are paranoid people believing in crazy things without evidence (and often in opposition to actual evidence to the contrary), and others taking advantage of those people, the idea of the "conspiracy theory" will flourish.
We just have to understand that not all conspiracy theories are equal, and that legitimate concerns can get dismissed as being a "conspiracy theory". We'll always have to evaluate theories involving conspiracies according to the evidence we have and decide for ourselves how likely they are and which ones are worth our time/energy.
I suppose the conspiracy part would be guessing how common it is?
The actual goal on the part of YouTube is not political activism but engagement for ad views. Polarizing content achieves that for both the people who want it and the people who hate it.
Just as I am absolutely sure it also leads to extreme left-wing content.
This public admission is a great start; keep going.
The real question should be: should we prevent this type of content from being recommended, and where are the lines?
As a side note, I'd love to see Twitter-style Community Notes implemented on YT. It's the one good feature Twitter has shipped in a long while. And yes, YT has notes, but they're written by YT themselves (the COVID ones, for example).
If you're specifically looking out for a long list of right-wing extremist content categories, but only one category of left-wing extremist content, is it any wonder that you'd find YouTube pushes people toward extremist right-wing content to a greater extent than toward the extremely limited left-wing extremist content being considered?
- MSNBC
- Senator Bernie Sanders
- Elizabeth Warren
- Vox
I mean, I suppose it is understandable if your political experience is solely American. But I do wonder: if one considers these "very left," what will happen when they come across political concepts such as anarchism? If they read Malatesta's writings, for example, would their minds just explode?
[1]: https://www.pnas.org/doi/10.1073/pnas.2213020120#supplementa...
They defined problematic channels as anything specifically espousing far-right ideas, and found that right-wing users were only slightly more likely to be recommended content from them.
It's kind of disappointing they couldn't find something problematic or conspiratorial from the left, even just for the sake of comparison.
This depends on the researchers' definitions of "extremism" and "conspiracy theories."
- Recently we've seen many left wing people state that disassembling people in front of their families - surely an 'extreme' act - is a 'beautiful act of resistance', and that calls for genocide against Jewish people (surely also 'extreme') may not constitute hate speech in some contexts.
- For the last 7 years we've had many people believe in the Russiagate conspiracy theory.
- I'm not sure "problematic" has any real meaning.