YouTube shows you more of what you look for. If it stops doing that, people will switch to a third-party platform that lawmakers will be unable to censor (e.g. Reddit, Digg, 9gag, 4chan).
It's a scapegoat. If you want people to stop looking up content about distrusting the government, the solution is a more transparent government, not to stop them from watching the content.
It doesn't just do that; it pushes you further and further into specific niches. It doesn't have to work that way, but YouTube has designed it to, funneling users into the famous YouTube rabbit hole.
This rabbit hole encourages extremism in some people, and that can be harmful to society. Why the defeatist attitude, that huge companies have to encourage extremism or people will stop using their products? That's nonsensical.
YouTube recommends videos you are likely to click on. It is unclear to me what else they are supposed to recommend.
It is also unclear to me that the recommendations ought to be moderated by some embedded state entity to make sure the results align with what is in political vogue.
I sense there are two processes at work here. One is simply an algorithm that recommends more of the same. If I watch a Jordan Peterson video, I start seeing recommendations for his other lectures and podcasts. I doubt YouTube's algorithm is specifically trying to change my views politically. Likewise, if I end up seeing a lefty BreadTuber's video, I start to see more socialist videos.
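The "more of the same" dynamic in that first process can be sketched as a toy content-based recommender. To be clear, everything below (the tag data, the scoring function) is a hypothetical illustration, not YouTube's actual system, which is far more complex:

```python
# Toy "more of the same" recommender: score candidate videos by how much
# their topic tags overlap with the user's watch history.
from collections import Counter

def recommend(watch_history, candidates, top_n=2):
    """Rank candidates by tag overlap with previously watched videos."""
    # Count how often each tag appears in the watch history.
    seen_tags = Counter(tag for video in watch_history for tag in video["tags"])

    def score(video):
        # More shared tags with past views -> higher score.
        return sum(seen_tags[tag] for tag in video["tags"])

    return sorted(candidates, key=score, reverse=True)[:top_n]

history = [
    {"title": "Peterson lecture", "tags": ["psychology", "philosophy"]},
    {"title": "Peterson podcast", "tags": ["psychology", "politics"]},
]
candidates = [
    {"title": "Another lecture", "tags": ["psychology", "philosophy"]},
    {"title": "Cooking show", "tags": ["food"]},
    {"title": "Debate clip", "tags": ["politics"]},
]
print([v["title"] for v in recommend(history, candidates)])
# → ['Another lecture', 'Debate clip']
```

Even this naive sketch reproduces the pattern described above: watch two Peterson videos and the next lecture outranks everything else, with no political intent anywhere in the code, just similarity scoring.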
The other thing is that we really don't have a shared definition or word-feel for "extremism". The people who are most vocal about curbing extremist content also happen to be wildly partisan. I don't think that is a coincidence. I sense the false-positive rate is extremely high in this regard.