In the end the core problem with the recommendation system is simply that there is only one list of recommendations, which makes it damn near impossible to switch to a different topic unless you start account hopping. The new topic-bar helps a little, but still offers no way to get rid of an unwanted topic completely.
But on a useful note, maybe YouTube needs to provide a function to see a list of random videos that are unexplored or new.
There seems to be a downward spiral: content that prioritizes grabbing attention over all else gets the most clicks, which then drives more content to focus on the same thing. The quality of the content itself seems increasingly inconsequential and has fallen off a cliff in the face of click spam.
Anecdotal evidence 1: I used to watch YouTube videos for investment ideas, and now all of these streamers pump out the same low-quality, barely researched content with their face plastered on top with a shocked look on their mug.
Anecdotal evidence 2: My son was shown youtube by one of our nannies (despite my rules on no youtube) and has since become semi-addicted to the most brain dead content of toys being made into knock offs of the reasonable shows he used to watch or some adult man child being a weird clown in public.
Idiocracy will be due to memes as much as genes.
Directly to Susan Wojcicki's face in a live public interview: https://youtube.com/watch?v=zKrQzJgFWdw
(Starting about 1:50 into the interview.)
Also numerous times in her column and podcast (I can chase those down if requested, though they should be prominent).
See "Algorithms Won’t Fix What’s Wrong With YouTube" https://www.nytimes.com/2019/06/14/opinion/youtube-algorithm... (https://news.ycombinator.com/item?id=20184282)
Guess it had good acting, surely no one in their right mind is that excited to eat 5 kilos of meat with water in 5 minutes or whatever...
Just kinda felt disgusting, why does this have a million views? This guy's channel was all about eating shit fast... Wat
It doesn't have any "the keywords I'm interested in are" section. It doesn't have any fine-grained "show me less like this" or "show me more like that" options.
That all the filters of the net (Google, FB, etc.) have convinced users that an out-of-their-control recommendation system is a triumph of this sort of system. If you just demand the recommendation system be different, you're just asking for users to be controlled in a different way.
Edit: whenever I mention this, I get "but the users are morons and need to be led". I'd say instead: users are often unthinking and need to be educated.
Oh, is that all?
Anecdotally (n=1, I know), I watch/listen to a lot of leftist lectures and speeches and my feed does not get inundated with them nearly as much... Really makes you think.
The problem is that the right is much better at producing flashy, low-attention/low-commitment content - the kinds of things that excel in the metrics the algorithm would naively optimize for. Case in point, the lectures you watch are nearly the exact opposite of that popular, easily-digestible content.
Now they've removed the movie and left only a very bad, unwatchable commentary on it. Let's pretend that solves the problem.
I watched one video on preppers out of curiosity, and my feed is now full of videos about machine guns, shotguns, and the correct military gear for pretending you're in the army.
But all of that pales in comparison to their Tik-Tok clone attempt of short videos. Talk about "chum"! They're clearly highly optimised to pull people in and get them hooked.
Luckily, I know digital drugs when I see them. I was one of those freaky people that could recognise the "hooks" in games like World of Warcraft and simply stop playing before the "endgame" mechanics sucked your life down the drain just so that Blizzard can get their recurring revenue.
Seems like this data-driven optimisation for addicting people to low-quality content is the mental opioid epidemic of the current era...
It is not even a separate expense for me, as I just use my running shoes as my regular shoes. Also, maybe it is my gait or my lack of weight, but I estimate my shoes last 1k miles.
Since Google/YT are essentially shaping your thoughts and behaviours, the logical optimization for them is to shape you into a working sportsman who spends all disposable income on services, experiences, and goods bought online.
me: art, philosophy, religion, culture -- avidly
me on YouTube: Rizin/UFC re-runs, live concerts of commercially successful bands, D&D-themed games
On a related note it would be interesting if someone has made some "rabbithole resolving" software - I've noticed that these things terminate in extremely similar ways for most people, and have had a number of conversations laughing with others about how we've gone down the exact same content route. Write some code to find the bottom for us and save all the headache (fun)!
Seriously? Is it 'telling 'cute' stories to distract from the black box youtube algorithm that creates white supremacists' -time? It comes across as incredibly unempathetic, gaslighty and dismissive, posting on a thread that is discussing an important societal problem; especially using the word 'dangerous' in the sarcastic way in which you used it here.
It's frustrating that the readers themselves have to replace YouTube's cryptic and doublespeak-y 'borderline' label in this article with the word 'white supremacist' and/or 'alt-right' to get the real picture.
Way way more recommendations for stuff I'd never consider watching, recommendations for "big" youtubers (I almost exclusively watch smaller channels based around niche hobbies), stuff like that.
I used to think their recommendation engine was amongst the best of any service I've used, now it's junk.
In fact, I rarely find YouTube's recommendations to be useful. Occasionally, yes, but typically I just get that "more of the same" problem.
However, that may be explained by how I use YouTube. The article is pretty vague and hand-wavy about how the recommendations really work, but it's possible that the fact that I don't drive my YouTube viewing from the recommended videos list makes the video recommendations worse for me.
Instead, I have to explicitly go into "Subscriptions", and hope that it was posted recently-enough for the channel to show up in the 6-or-so sorted-by-recency channels at the very top, rather than in the alphabetized list (since there are so many channels in that alphabetical list, that finding anything marked as new in that list is basically impractical.)
This is how I've always used YouTube, and it wasn't that long ago that I learned most people don't do it that way. My usual pattern is to go to the subscriptions tab and scroll down to where the last video I viewed was, then start watching from there up towards the top of the list.
It would be really handy if the YouTube app had some way of making that easier.
At this moment, my homepage's first 10 videos include 7 that I've already watched.
It also seems to have some classification issues, too. It lumps a lot of stuff into "DIY", and woodworking is part of it--but so is a lot of stuff that more fits under "construction" or "carpentry". No shade thrown, that stuff can be interesting too, but there's a large gulf between "crotchety woodworking dork going on about tablesaw safety" and "refinishing a backyard shed with glamour shots of a Home Depot sponsored miter saw set to bouncy stock music" that the algorithm does not seem to grok. Maybe it wants me to hatewatch that stuff, I dunno.
However, it becomes this echo chamber of only recommending the same content over and over again because they don't want to suggest something you don't like. That's because to their metrics, you don't watch it all, you don't see the ads, and then you "potentially" leave their platform because there's "nothing you like on it".
To pick an example, my YouTube recommended section is full of videos of people reacting to the things I actually like to watch. I don't want to watch people reacting to the things I like. I just want to watch the things I like. I try to tell YouTube that by saying "Not Interested" to all of the reaction videos, but it doesn't seem to make a difference.
SELECT DISTINCT videos.*
FROM videos
JOIN followers ON videos.creator = followers.creator
WHERE followers.user = $ME

We hypothesized this might be because you're less likely to be hooked yet, so it wants to really give you the good stuff in the hope that you then subscribe to or comment on something. And once you're signed up, you might stay on the site longer if you need to keep looking for stuff you like, generating more ad impressions.
I should really do a comparison logged-out vs. logged-in because I could swear those ads are different amounts and durations. Also it's percentage-wise about as much as the free legacy TV channels, except of course YT doesn't need to pay the production company -- ShadyVPN already takes care of that.
I remember about a year ago, it got into a weird state where I watched an old Cat Power music video, then every time I looked at recommendations on other videos it was recommending it, again and again... I'd watch the video again to hopefully make it go away, and it would still keep recommending it. It was very strange. I ended up just choosing "Don't recommend this" on it.
Of course, that'd be a pain to manage, so I haven't gotten around to it.
Their recommendation system is really the heart of the beast. It's essentially a dopamine reward system, not too dissimilar from a slot machine at a casino. Pull the lever (refresh the page) and get random results! It sucks you in to a cycle of refreshing, finding videos you're interested in, clicking, rinse, repeat.
Personally I found myself losing hours per day to this behavior and gaining very little from it.
Far more productive, is going to YouTube for something SPECIFIC. Use the search function to find what you're looking for, and watch those videos, but never use the homepage or sidebar recommendations.
There are actually google chrome plugins which will totally remove the recommended section:
https://chrome.google.com/webstore/detail/unhook-remove-yout...
I get what you're saying about the reward system, but I normally run out of content I want to watch, and the recommendation engine isn't really suggesting anything interesting. That's after watching a few videos every other evening.
There are a few channels I subscribe to, and I do a regular clean-up, like twice a year, removing the subscriptions I'm no longer interested in. YouTube mostly shows me the new videos from those channels and a few similar videos, and then kinda gives up.
Two things I would recommend: don't use YouTube without a subscription, and be sure to log in. The default videos on the front page are all terrible, and the ads are amazingly bad. Regarding the ads, I think that people in smaller markets are a problem for YouTube: they simply don't have enough ads, so they can't target them in any meaningful sense.
Unhook for Firefox: https://addons.mozilla.org/en-US/firefox/addon/youtube-recom...
Well then I guess that I'm lucky that the recommendation system almost entirely fails to find videos that I'm interested in!
Unfortunately the search is far from independent of the recommendation algorithm.
Granted, I've never watched much YouTube, but that curbs it even more.
I'd sometimes forget what I came for based on the recommendations from the homepage. Not that I was logged in and saw interesting stuff -- not even -- just the sheer uselessness of what people apparently watch the most (clickbait titles, soccer match replays, gasping surprised face with all-caps title, you know the type). Adding that part of the homepage to the ad blocker, using the pipette tool in uBlock Origin, helped a lot. Or add the youtube search to your address bar (firefox: add keyword search, or go via ddg bang commands) so you don't have to open the homepage to get a search box.
I've found myself doing this more recently, but I also like exploring. I think my goal is to not start from recommendations, but a specific curiosity, and roll from there.
Then again, there are plenty of creators that will bring me things I never even thought to consider. 3blue1brown was mentioned in the article, and I have notifications turned on for them because I know for a fact I will watch the entire video.
youtube.com##ytd-browse[page-subtype="home"]
youtube.com###related
youtube.com##.html5-endscreen
The first one removes the homepage, the second one the list of related videos when you're already on a video, and the last one the end-screen tiles.

I don't get that same rush you describe. Maybe the short videos are the problem?
Please don't make low effort posts like this here.
1. If you don't have your YouTube activity history enabled, the algorithm doesn't remember what you've watched and you'll probably get clickbaity/junky recommendations.
2. If you have set your history to expire after some amount of time, it will start re-recommending videos you've already seen once it has forgotten you have watched them.
You can set both options here:
It would be nice if Youtube offered a feature where the recommended lists are completely removed and youtube simple loads as a search bar. Where you find what you are looking for without distractions.
I've done this myself by adding the following rules to ublockorigin:
www.youtube.com##ytd-watch-next-secondary-results-renderer.style-scope.ytd-watch-flexy
www.youtube.com###comments.ytd-watch-flexy.style-scope
www.youtube.com##ytd-browse.ytd-page-manager.style-scope[page-subtype="home"]
www.youtube.com##ytd-guide-renderer
www.youtube.com##ytd-vertical-channel-section-renderer
It wouldn’t make sense for them to show objectively “crappy” content to every new visitor, just to get OP to turn on their recommendations. What we consider crappy is what other people consider entertaining.
1. I've had my YT history disabled for >5 years now. Recommendations still work just like for everyone else.
2. Even when it's enabled, it will keep recommending videos you've already watched.
I’m uncertain on how cookies are maintained. Are these Safari cookies behind the scenes?
My point is, contrary to GP’s claim, there is some mechanism for viewing history even if it is turned off.
If you are not logged in you can't do this, for obvious algo manipulation reasons.
I keep getting old furniture restoration videos in my feed. I've seen them already. How about different channels on the same topic with new videos posted?
They probably hate this idea but who knows, that might even increase vid views.
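The suggestion in the comment above (stop re-serving watched videos; surface different channels on the same topic) can be sketched in a few lines. To be clear, this is a toy illustration with invented data structures, not how YouTube's ranker actually works:

```python
# Hypothetical post-ranking filter: drop already-watched videos and cap
# how many candidates a single channel may contribute, so new creators
# on the same topic get surfaced. All names here are made up.

def diversify(candidates, watched_ids, per_channel_cap=1):
    """candidates: list of (video_id, channel, score), best score first.
    Returns video ids to show, in order."""
    per_channel = {}
    out = []
    for vid, channel, _score in candidates:
        if vid in watched_ids:
            continue  # never re-recommend something already viewed
        if per_channel.get(channel, 0) >= per_channel_cap:
            continue  # this channel already used up its slots
        per_channel[channel] = per_channel.get(channel, 0) + 1
        out.append(vid)
    return out
```

With a cap of 1 per channel, a second furniture-restoration video from the same channel would be skipped in favour of a different channel's video, which is roughly what the commenter is asking for.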
> 2. If you have set your history to expire after some amount of time, it will start re-recommending videos you've already seen once it has forgotten you have watched them.
The meaning of "recommendation" seems to have shifted from "what we think you want" to "what we want for you."
It's not a wrong use of the word, just different from what you might expect.
I think this is a pretty fantastic summary of what has gone wrong with a lot of big tech. Facebook, Twitter, youtube, Google search, iOS, Android, and Windows all have moved toward taking away user control and trying to move you toward more of what they want you to want. If it's not recommendations, it's forced updates, device scanning, or "influencers".
We badly need some of that disruption these same companies used to preach about. The current cohort is becoming ossified and user hostile, just like the companies they replaced.
I found that a 50/50 mix of composted coffee grounds and pure sand made a great growing medium for potted peppers.
Just because I click on a video and watch it entirely doesn't mean I want to see more like it, and it doesn't mean it was healthy for me to watch, it could have just been a distraction. But now this system will latch on to my need for distraction and keep nagging me with stuff in the long term that is ultimately harmful to me.
I don't believe recommendation systems like this can ever understand people well enough that a person can walk away and feel healthy about it. But they are great at generating money, and that is why they exist. I can see and appreciate the amount of effort going into making these systems better, but they are fundamentally flawed and limited.
Also they say:
> borderline content—that is content that comes close to, but doesn't quite violate our Community Guidelines.
A ToS cannot treat anything as "borderline"; that's akin to robbing a bank and then telling a judge your crime was "borderline" because you were out of money and needed to get your financial situation straight. My point is that ToS and law are binary: either something is allowed or it is forbidden; there is no middle ground.
They addressed this, I think, later in the article
> With all that, why don’t we simply remove borderline content? Misinformation tends to shift and evolve rapidly, and unlike areas like terrorism or child safety, often lacks a clear consensus. Also, misinformation can vary depending on personal perspective and background. We recognize that sometimes, this means leaving up controversial or even offensive content. So we continue to heavily focus on building responsible recommendations and take meaningful steps to prevent our system from widely recommending this content.
So much of what counts as acceptable moderation is a constantly moving target. Even if there is an exactly right answer today, it might become an important conversation to have tomorrow.
That's wrong. You cannot make ad hoc decisions without forethought just to please someone or something inside or outside the company (organization). I'll again refer to the law: law is built on concepts and premises that have a foundation in morals; for example, look at the US constitution, or any constitution in the world.
5 years ago I used to watch WW2 documentaries on YT, and I regularly stumbled upon neo-Nazi content. Nowadays I can barely see any, but together with the neo-Nazi content they removed legitimate WW2 documentaries that were meant for educational purposes only, I suppose because some of them used Nazi symbols and/or footage of Nazi speeches.
So 5 years ago neo-Nazi content was acceptable, but today it is not?! It doesn't make sense to me.
They want to be politically correct; that's why they change their community guidelines and ToS non-stop.
Chicken/egg. The above is true because they flood the sidebar with algorithmic recommendations instead of your subscriptions, and have tuned their notifications to not give you as many, even if you are subscribed and even if you click the notification bell.
According to the blog post, YouTube is just like a public library. Never mind the absolute absurdity of comparing reading books from a large library to watching YouTube videos. It's strange, because I never remember walking into a large library and being tracked everywhere I went and accosted by librarians trying to "recommend" books to me. I remember searching the catalog without any interference. I remember going to the stacks to retrieve books and then discovering other books nearby, because the stacks were organised according to a public standard, not a secret algorithm. I remember checking out books so I could read them outside the library, wherever I wanted, in private. (I download YouTube videos as opposed to using Google's Javascript "video player".)
This blog author wants readers to believe going to a library and finding books is "hard". I don't remember it that way. Not every patron who visits a library asks a librarian for assistance. YouTube doesn't give anyone a choice to be left alone. There's a "librarian" looking over your shoulder the entire time. The amount of telemetry and tracking while using the website is totally unnecessary.
They have at least have "not interested" but it's a clunky system.
TikTok is having more luck because they're getting this data by watching WHEN you stop watching a video. Videos automatically play, so the default signal when you swipe past something is "no".
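A rough sketch of that implicit signal: the fraction of a video actually watched becomes the training label, with a quick swipe-away counting as a strong "no". The thresholds below are invented for illustration, not anything TikTok has published:

```python
# Toy version of an autoplay-feed implicit label. Because videos play by
# default, an early swipe-away is treated as an explicit negative, a
# near-complete watch as a positive, and the middle as noise.

def implicit_label(watched_seconds, video_seconds,
                   neg_below=0.25, pos_above=0.8):
    frac = watched_seconds / video_seconds
    if frac < neg_below:
        return -1  # swiped past almost immediately: "no"
    if frac > pos_above:
        return +1  # watched nearly all of it: "yes"
    return 0       # ambiguous; skip this example
```

The design point matches the comment: no "not interested" button is needed, because every autoplayed video produces a label by default.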
I don't think personal recommendations are a good use of developer effort, I think I'd be fine if it just showed me videos liked by other people who have liked the same videos I have.
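That "videos liked by people who liked what I liked" idea is classic item-based collaborative filtering, and a toy version fits in a dozen lines. The data layout here is invented for illustration; a real system would use sparse matrices and similarity normalization:

```python
# Minimal co-like recommender: score each unseen video by how many
# shared likes I have with the users who liked it.
from collections import Counter

def co_liked(likes_by_user, me):
    """likes_by_user: {user: set(video_ids)}.
    Returns video ids ranked by taste-neighbour support."""
    mine = likes_by_user[me]
    scores = Counter()
    for user, liked in likes_by_user.items():
        if user == me:
            continue
        overlap = len(mine & liked)  # shared likes = taste similarity
        if overlap == 0:
            continue                 # no taste overlap, ignore this user
        for vid in liked - mine:     # only videos I haven't liked yet
            scores[vid] += overlap
    return [vid for vid, _ in scores.most_common()]
```

Nothing personal about it beyond the like history, which seems to be exactly what the commenter wants.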
This is a weird way to say "ideas that we suppress for being misinformation are sometimes actually true."
Using uBlock Origin, right-click the part of the page you want removed, select the "remove element" option, confirm it's got the right bits selected, and nuke.
Using Stylus, you'll be opening up the element inspector and finding the relevant selectors. The fastest option is to apply a "display: none !important;" declaration to them.
Stylus is by far the more powerful and flexible of the two, and you can do far more with it if you care to. uBlock should be sufficient though.
At the very least, YT should recognize that this metric is quite subjective, and given the vast scale of information out there, they simply cannot be the experts.
Sometimes I just don't want any magic; just give it to me.
If I like watching 'flat earth' videos, then recommend the next best 'flat earth' video. Just keep it simple.
My first thought reading this was about reinforcing a stereotype: the son learning hard sciences and the daughter watching fun shows about people things.
Did anyone else notice or think about that?
But it is also telling his reality. It wouldn't be honest for him to say his daughter watches 3Blue1Brown if she doesn't, would it?
Also, what's the deal with MrBeast and other videos always being in the defaults? Did they pay for that exposure or what?
Kind of. Mr. Beast videos make Youtube tons of money. Shouldn't be a surprise that Youtube pushes them harder than videos that don't make a ton of money.
It was about how ancient aliens mined out the moon and control us or some other crazy pills kind of idea
My actual normal video recommendations are quite good and shift alongside my interests over time quite well without prolific use of "not interested" or incognito shrugs. Rented movies seems bad though, especially since I'm already a paying premium user!
Well it doesn't live up to the principle, not even close.
> It’s constantly evolving, learning every day from over 80 billion pieces of information we call signals.
This is probably why. It's a positive feedback loop mechanism for noise.
Another reason is probably that its metric for "give them value" is engagement. It's a dishonest metric, because "gives them value" is less important than "gives us value", so even if you found that it doesn't give them value, you've got a perverse incentive to keep doing it and pretending it's in their best interest.
OK, that's a very simple example they are throwing out there. But there are many cases where nothing is really settled and Youtube will arbitrarily decide what is fake or not - even though they have absolutely no right to make such kind of judgment. "Authoritative News" can routinely mean "facts" and "hoaxes". It's not like we have never seen those in recent history...
FWIW: I pay for youtube music, which comes with youtube premium included so I don't get ads, but the recommendation system is pretty much useless.
Is the earth flat? Of course not, but should we punish people for thinking this? Our society verbally says no, but its choices say otherwise; it tries to do it through undemocratic processes (aka YouTube???). If we provided better education to these people, or even a better health system accessible to everyone, we could have better results.
IMHO, in the long run YouTube will just send these people to other, more "hospitable" platforms, and I have a feeling that will help increase the number of flat earthers. So YouTube is in an interaction with its customers, not only maintaining their echo chamber but possibly engineering the echoes. The decrease in the variability of members is one way.
Do we need armadillos when we eat chicken?
I just came here to ask why on earth YouTube keeps showing me the same Casey Neistat video 1000 times, even though I have never chosen to click on it. Doesn't it make sense to just stop showing it after, like, maybe 10 times?
Also, what's up with showing videos that I have already watched over and over again?
I have just realized the algorithm weighs quantity a lot over quality. I have about 200 subscriptions, including many small-time creators with niche content, but it's the same few content creators again and again in my recommendations. I don't even see right away when one of my subscriptions has posted a new video.
NRI here. I watch a lot of programming content on YouTube, but I have never watched a single programming video in Hindi, yet my entire recommendation feed is filled with programming in Hindi. Can we just ask the algorithm to stop boxing user profiles into racialized tendencies?
So the system is relatively good starting from a clean slate for a short period of time when focused on a relatively narrow set of content. Other than that, I haven't found it to be very useful. Before doing this, I would rely on search and ignore most if not all of the recommendations that came my way.
I recommend using an extension called 'Improve YouTube!' [1], which allows you to collapse/hide recommendations, comments, ads, related videos, autoplay, etc.
There's also YouTube Vanced [2], which allows some of these things on Android.
1. https://chrome.google.com/webstore/detail/improve-youtube-vi...
On some level people love to click on things they expect to hate and downvote it. I suspect that increases the odds of showing them similar content, vs. say, spotify which explicitly says "sorry we will try to avoid things like this". Youtube should make it clear which strategy they're doing, and they should do the latter.
The only thing you need to read from that post
I'm at a point where I don't want better recommendations to tell me what to think. I'm already having enough of a challenge keeping my focus with the status quo.
Basically, all recommendation systems are based on one hypothesis: "I know more about your preferences than you do." I can't say I agree with it.
Once, I clicked on a video about the Tetris World Championship Finals (I was curious!), and I was plagued with Tetris recommendations for months.
On the other hand, if they let youtube go without any ’fringe control’ the society might collapse into tribal war.
All the previous information distribution channels had some form of control.
Perhaps this is the reason the recommendations are junk? How can Youtube recommend you videos if you hide what you like from Youtube?
I watch a lot of Youtube, signed in, on Youtube.com, and my recommendations are great.
I wish I could organize my youtube subscriptions better. Right now everything lands in a flat feed of videos and all the stuff I feel like watching now is buried deep within 50 music uploads.
I got myself into this situation by subscribing to hundreds of channels but I also don't want to pick favourites and unsubscribe from the rest, simply being able to put channels into categories would be perfect.
Unfortunately, it's been all but entirely crippled through API limits.
When it did work, what it allowed was:
- A terminal text-only interface.
- The ability to search by keyword or publisher.
- The ability to restrict to music, or all videos.
- The ability to set playback preferences (audio only, video, preferred resolution)
- The ability to create current or permanently saved playlists. These could be edited down from the search results.
- Runs on Linux, MacOS, Android (under Termux via pip), and Windows (Cygwin or WSL). On Android, it permits backgrounding video or listening to audio only.
- Accrues no user history.
- Recommendations-free.
And the killer feature:
- The ability to play through a curated playlist from beginning to end.
(It also suppresses advertising, which is a plus in my book.)
You could also download directly from the utility (audio, video, or both). All state is retained locally; there's no need to have or use an account.
For topics of interest, I would stack up 5, or 10 or 20, or 40 videos, and roll through them over an hour, day, or week. It was really marvelous.
As noted, YouTube killed it through their application API control.
mpv offers the ability to play back either a single video, a (separately maintained) playlist, or a channel or YouTube playlist, and is my usual go-to. It's also useful on other sites (mpv uses the youtube-dl utility which supports a whole slew of video and audio sites and platforms).
Absent mpsyt, what I'll frequently do is use YouTube's site search or a third party (e.g., DDG video search) to grab a bunch of video URLs, and either play those from the commandline (mpv will take multiple arguments) or as a saved file (useful for restarting later). This gives an ad-free experience as well.
But for YouTube itself, mpsyt was amazing.
1. Have a specific video in mind. It mostly comes to me from memory of recommendations past. For example, "Hey I remember Crash Course Film History had an episode on 2001: A Space Odyssey."
2. Now the fun part: NEVER use search to find the video (external search engines too!) directly or indirectly. Rather try to bias the recommendation algorithm until it recommends that video to you again. You can only try to influence the algorithm by watching other recommended videos, liking, or disliking. Recommendations come from the homepage or from the sidebar when you watch other videos; this means you can't start your search by visiting their channel
In this example, maybe I'll try watching more content from Crash Course. I've been watching STEM Crash Course lately so maybe I'll click on the first Crash Course Humanities content that gets recommended. To bias towards movies maybe I'll watch myself some Nerdwriter. Again, note that searching for Crash Course Humanities content is considered cheating. It has to be recommended to you.
It's a race against yourself to get to the video you want.
Obviously, I don't have anything better to do with my time ;). Once I was looking for a video I was extremely sure Tom Scott made. After what was basically a nonsequential brute-force search through his channel (again, only through recommendations!), I still couldn't find it. I gave up, but I still didn't search; instead I emailed Tom. He replied only to say he's never made such a video. That's when I used search; it turned out to be from Vox.
I just wanted to "stick it to Google": if they are going to profile me and my interests, I won't make it too easy. With this game, I'm hoping my interests appear broader than they are to Google.
Have fun.
You can block channels pretty easily and never see content from them again.
https://static.googleusercontent.com/media/guidelines.raterh...
https://icannwiki.org/Brand_TLD
It's pretty strange to me that ICANN allowed the "brand TLD" concept.
I remember being impressed that the Aga Khan has his own TLD under this program:
They got it by paying money and meeting the other new-gTLD compliance requirements during the new gTLD gold rush.
None of those requirements involved current use or a forward commitment to heavy use, so the “while barely using it” is irrelevant.
Feels like it's not really trying.
I want to be able to select exploration/discovery vs sticking with familiar grounds.
It boggles my mind that there are zero explicit user settings and the only option is to somehow game the recommendation system.
If you ask me, it's the same story with just about every single digital service; recommendations are never perfect, merely useful. What's more, the stakes are just so low, literally spare time. The level of vitriol feels, to me, disproportionate -- crowding out more interesting discussions.
> As I’ve explained, we actively demote low-quality information in recommendations. But we also take the additional step of showing viewers authoritative videos about topics that may interest them. Say I watch a video about the COVID-19 vaccine. In my Up Next panel, I’ll see videos from reputable sources like Vox and Bloomberg Quicktake and won’t see videos that contain misleading information about vaccines (to the extent that our system can detect them).
What is Google/YouTube's full list of "reputable sources"? Is it even reasonable to maintain such a list, given how often sources treated as authoritative can be wrong? Doesn't this amount to artificially converging users' information exposure toward either the reigning authority or whatever Google/YouTube favors?
Generalizing my concerns further, I am not comfortable with having a narrow set of monopolistic tech giants serve as king maker in our information landscape. Perhaps what we need first and foremost is a renewed set of anti-trust laws to break up companies with giant market caps and to address the reduced competition faced by widely used platforms built on network effects.
what a sentence
> As I’ve explained, we actively demote low-quality information in recommendations. But we also take the additional step of showing viewers authoritative videos about topics that may interest them. Say I watch a video about the COVID-19 vaccine. In my Up Next panel, I’ll see videos from reputable sources like Vox and Bloomberg Quicktake and won’t see videos that contain misleading information about vaccines (to the extent that our system can detect them).
When my parents watch 30 consecutive videos that mention that you can't trust the mainstream media, what good is suggesting to them a Bloomberg video?
My parents aren't on FB or any other social media. But over the last 10 years they have gone off the deep end after watching more and more insane videos on Youtube. It started with hours of Ron Paul videos. Then Alex Jones videos. Who knows what cretins it offered up after that.
They've stopped talking to me recently. It's because I've gotten the "Kill Shot" after they explicitly told me not to. I'm apparently going to die from the COVID vaccine within the next 6 months to 3 years... On my last phone call, they told me that 300 old-money families (who are all also insane environmentalists) have hatched and are executing a plan to depopulate the planet to 500 million... as told on the "Georgia Guidestones".
The US Government also has mind-control devices that are convincing people to get the shot. They told me the mind control devices were first used during the Iraq War, because it convinced so much of the Iraqi Army to surrender en masse.
Also, all of our technology comes from the wreckage of crashed UFOs, most famously the Roswell UFO. My Dad thinks I'm naive for not believing this.
Both my parents have college degrees. My Dad has a master's in an engineering field. They were once relatively apolitical and rode out the Great Recession without any harm. Maybe they just didn't handle retirement well. But I have a decade of nearly daily emails from them... "Watch this! [youtube link]". When they are gone, I'll have documentation of their descent into madness, with hyperlinks all going to one domain. I'm sure YouTube is doing all they can.
I've watched Alex Jones videos myself. I did not leave wanting more and no recommendation algo could change that. (Lest I leave the wrong impression, the same is true of AOC going live on IG.)
I wonder who that article is addressed to? At first it may seem they want to reassure viewers that they will get shown the best content. But the longer the article goes on, the more it sounds like a reassurance that they will absolutely block out unwanted political views.
Why do they feel the need to put out such an article? To stave off government regulation? For their own political activism? To reassure the leftist mob that they won't be partly guilty of electing another Trump?
I mean, as a user, I'd like reassurance that the algorithm tries to show me videos I want to see, not videos that others want me to see.
I am not afraid of being shown a flat-earther video, so I don't get the appeal of the promise to not show me certain things.
Thanks Algorithm™
Basically, when you've accidentally created a white nationalist movement, you tend to ratchet back your ambitions.