Some of the filtering is based on what the user wants to see, some of it is based on some notion of how "good" a piece of content is (scored by likes and engagement numbers), some of it comes from advertisers paying to have their content make it through the filter, and some of it is Facebook deciding what should be seen and what shouldn't (mostly driven by their desire to keep you on the platform). Every single thing you see on Facebook has made it through a huge filter that ultimately decides whether it's something you should see or not. And the inevitable outcome of building a gigantic what-information-do-you-get-to-see machine is that there are many, many parties trying to influence the machine.
Phone lines don't have that problem.
If Facebook limits the filtering to engagement, then it isn't the fault of Facebook that political content is engaging. That's just human nature. Disasters, outrage, politics, polarizing topics - these are all popular topics both online and off-line, and spread quickly as town gossip well before Facebook.
It is only when Facebook steps in and says that particular topics need to be exceptions to the filtering rules that apply to everything else, that they make themselves into a political actor.
For instance, let's say that the news feed showed you content based purely on the number of likes. If political posts get lots of likes, that isn't Facebook's problem. If the same ranking rule (# of likes) applies to all posts, then they remain neutral. As soon as Facebook says "content from x person will have their ranking artificially changed to reduce/increase engagement with it", thereby making an exception to the rule that applies to everything else, they have now become a political actor. A rough sketch of that distinction is below.
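To make the distinction concrete, here's a minimal sketch in Python (made-up post data and function names, not anything like Facebook's actual ranking code): the first ranker applies one rule to every post, the second adds per-author adjustments that override that rule.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    likes: int

def rank_neutral(posts):
    # Same rule for every post: sort by like count, nothing else.
    return sorted(posts, key=lambda p: p.likes, reverse=True)

def rank_with_exceptions(posts, author_multiplier):
    # Same base rule, but certain authors get their score artificially
    # boosted or suppressed -- the "exception" described above.
    def score(p):
        return p.likes * author_multiplier.get(p.author, 1.0)
    return sorted(posts, key=score, reverse=True)

posts = [Post("alice", 900), Post("bob", 1000), Post("carol", 400)]
print(rank_neutral(posts))                        # bob, alice, carol
print(rank_with_exceptions(posts, {"bob": 0.5}))  # alice, bob, carol
```

The point of the sketch is only that the second function encodes an editorial judgment about specific authors, while the first does not.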
If I build a bridge intending it to stay up and it happens to fall down 6 months later, I'm responsible for it. Facebook created an algorithm that divides people politically and that surfaces content that is provably fictional. So they should be held responsible for it regardless of their intent. They don't get to invoke "common carrier" status when they're writing software that makes decisions about what you do or don't see. What makes a telephone a "common carrier" is the fact that the telephone doesn't decide who you call.
It doesn't matter whether it's software or a human. What matters is that decisions are being made by Facebook about what you do or don't see.
Whether or not it is intentional is immaterial to the effect. The law doesn't care about your intent. I wouldn't intentionally dump toxic waste into a river but I'm liable for dumping whether I intended to or not. Mark Zuckerberg can't just throw up his hands and go "oops it's software I can't help it" when it's his company that made all of the decisions about how the software works.
This isn't correct. The law in most modern democracies, as far as I'm aware, is very concerned with intent.
This is why we generally define murder and manslaughter as distinct.
> Murder is the unlawful killing of another human without justification or valid excuse, especially the unlawful killing of another human with malice aforethought.
https://en.wikipedia.org/wiki/Murder
> Manslaughter is a common law legal term for homicide considered by law as less culpable than murder.
https://en.wikipedia.org/wiki/Manslaughter
Murder vs manslaughter is the extreme example, though you'll find courts are broadly quite concerned with intent.
The information is out there. There are reliable news sources. There are reliable databases and encyclopaedias and journalism. If people choose not to read them then that's on them.
But... I think what we're seeing with political content is just a symptom of the real problem.
> Disasters, outrage, politics, polarizing topics - these are all popular topics both online and off-line, and spread quickly as town gossip well before Facebook.
This is true. But when information spreads through people's conversations with each other there are limits to how fast it spreads. There's also a lot of room for dialogue and different perspectives. If I have some silly conspiracy theory that I want to spread around, it's going to be pretty hard to convince the people around me that 5G is going to activate microchips that were injected into my bloodstream. They will likely point out that basic laws of physics don't really allow for that. But if I know how to game a social media algorithm[0] to connect me with millions of people that are susceptible to that kind of thinking, I could convince a shockingly huge number of them to believe it[1]. Especially if the social media platform isolates those people from opposing opinions and connects them with people that think similarly.
I think social media is like removing the control rods from a reactor. Those basic human flaws are now being amplified and capitalized on at a scale we can barely even grasp. And it really doesn't matter if Facebook, Twitter, etc. are "at fault" or not. It's a fundamental problem with these services and the problems will continue to get worse.
0. https://www.npr.org/2020/07/10/889037310/anatomy-of-a-covid-...
1. https://www.cnn.com/2020/04/13/us/coronavirus-made-in-lab-po...
Does any site actually do this successfully? It seems to me that even sites that lean heavily towards algorithmic curation (including HN) still have an element of human veto.
Being political is not an incidental facet of Facebook; it's a core intention.
2. Does it matter if it’s Facebook’s “fault” or not? The issue is their power.
Imho, ideally they would acknowledge and accept responsibility for their power, and in the US at least there would also be some laws regulating them in this regard.
This is like saying discrimination that gets baked into an ML model isn't the creators' fault imo.
And the inevitable outcome of building a what-calls-go-through machine is that there are many parties trying to influence the machine. Eg. faking caller ID, evading blocks with throwaway numbers, spamming no-response calls to figure out which numbers are valid to target, faking a robot voice to pretend to be a real person.
Practically every modern platform uses centralized systems to filter the noisy world down to something fit for purpose, and sometimes this intersects with political issues. That's no reason to expect a platform like Facebook to become even more political in their stance than the existing level of politicization that is almost impossible to avoid.