> This is not what DDG are doing though. They are explicitly inserting their bias
Well... yeah. How did you think instant answers worked in DuckDuckGo? They were coming from a list of curated sites.
A search engine is not accidentally biased; it is biased by design. The entire point is that DDG applies a set of criteria to decide what does and doesn't count as a relevant search result. Being biased toward relevant results is the whole point of a search engine.
This is like arguing that Linux distribution maintainers are biased because they curate which software goes into the Arch package repositories and decide for users what is and isn't malware. Yeah, of course they do that, that is literally their job -- literally the entire reason why those repositories exist is to filter out bad software. The point of curation/filtering/searching is to have someone else eliminate a bunch of irrelevant information and rank the rest based on a set of rules about what is and isn't relevant/useful.
Likewise, the job of DDG is to filter results based on their opinions about what results are good and bad. You can disagree with their criteria or argue that they're filtering too aggressively, but the idea that filtering itself means they're not operating in good faith is absurd -- filtering is why they exist. We want a search engine that brings up accurate/useful information when we search for a topic.
It is astonishing to me that so many people on HN don't understand this, that they apparently never sat down and thought about what the phrase "search engine rankings" actually implies about what the search engine is doing behind the scenes.
----
> That they were even able to implement some sort explicit bias functionality so quickly is also a concern to me.
Kind of a sidenote, but this is also a good example of how bad people's instincts are about how algorithms work. People assume that an algorithm is neutral and that manually intervening is bias -- but an algorithm is just the accumulated result of individual decisions about which criteria to prioritize and how to weight them. A result isn't more or less biased just because it was created directly rather than indirectly.
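To make the point concrete, here's a hypothetical toy ranker (everything here -- the signal names, the weight values -- is made up for illustration). The "algorithm" is nothing but a set of hand-chosen weights, so tweaking a weight by hand and choosing it at design time are the same kind of editorial decision:

```python
# Toy ranking sketch: the weights ARE the bias, whether they were
# picked up front ("the algorithm") or adjusted later ("manual intervention").

def score(result, weights):
    """Weighted sum of relevance signals for a single result."""
    return sum(weights[signal] * value for signal, value in result["signals"].items())

def rank(results, weights):
    """Return results ordered by descending weighted score."""
    return sorted(results, key=lambda r: score(r, weights), reverse=True)

results = [
    {"url": "a.example", "signals": {"keyword_match": 0.9, "domain_trust": 0.2}},
    {"url": "b.example", "signals": {"keyword_match": 0.5, "domain_trust": 0.9}},
]

# "Neutral-looking" weights, chosen by an engineer at design time.
default_weights = {"keyword_match": 1.0, "domain_trust": 0.5}
# The same mechanism after a "manual" adjustment: one weight turned up.
trust_heavy = {"keyword_match": 1.0, "domain_trust": 3.0}

print([r["url"] for r in rank(results, default_weights)])  # a.example first
print([r["url"] for r in rank(results, trust_heavy)])      # b.example first
```

Both rankings come from the exact same code path; only the opinions encoded in the weights differ.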
It's troubling that so many people on HN of all places were apparently convinced that DDG was completely neutral on political content just because they assumed there wasn't any kind of manual curation or adjustment going on -- as if the absence of a manual step is what made it trustworthy. They assumed DDG was unbiased just because the biases were automated and a computer applied them.
If DDG had instead tweaked their algorithms to get rid of misinformation, would that not have counted as curation? That this attitude is so prevalent even on a technical site makes me pessimistic about the future of AI/algorithms in our everyday lives, because it shows that people have a fundamental misunderstanding of what algorithms are.