Facebook would be a much better platform if they removed the like and comment count from every post. It’s those numbers that lead people to believe that garbage is legitimate in the first place. By hiding these numbers it would force people to think for themselves.
Same thing with YouTube. Fake news becomes real news when it has enough views, likes, upvotes, comments, etc. These counts don’t really need to be shown to the end user.
Should we censor social media because people are too stupid to deal with it? I don't think so, and if you do, you're playing into the hands of corporations who want a cheap workforce.
Your "solution" isn't a solution but a hotfix that will cause more damage in the future. It's exactly the kind of shortsighted crap politicians are doing right now to get immediate results.
It was a conference organized by an organization that runs one of the major initiatives against false news, and where one of the main tracks was about addressing the issue of false news. Last year they had a massive booth in the middle of the conference hall about false news. This year they brought in a keynote speaker who presented several political slides drawing misleading conclusions that were later proven false, yet the whole thing resonated with the audience. It seemed not to matter a bit that it also just happened to be false.
When facts get in the way of a political message, even the most educated seem to let the facts go if the message aligns with their own views. I wish education would work, but I don't see it as a proven method for solving echo chambers or political news.
But how do you educate yourself, though?
The same argument was already made about books and newspapers: depending on the subject, a lot of them are trash. Same for TV news. Wikipedia is not a de facto trustworthy source either.
Except for first-hand experience of a subject, you will never be able to find a source that is inherently trustworthy (and you might even interpret your experience the wrong way, so you should doubt yourself too...).
To educate yourself, there's a point where you have to trust some of the information you get. Having that info come from a trusted source, or with a high level of peer review, is a useful first level of filtering. From there you can (and surely should) invest more time to double-check or cross-check if you want, but IMO quick filtering is necessary and can't be brushed away by saying "people should know better".
I believe this is a design issue (systems design, not UI design). Filter bubbles and confirmation bias can reinforce each other in a vicious circle. Financial incentives for "news outlets" to mislead people make it worse. That's the heart of Facebook's issue.
With a herculean effort we might be able to overcome that with education, but improving Facebook's design to lessen the impact of echo chambers and social confirmation would be much easier and faster. Removing all interaction is one way to do that, though I agree that it's a bit heavy-handed and goes against Facebook's incentives.
I agree, but it's impossible to educate everyone on every subject. I think in this case, people who are experts in the matter should give a reputation rating on the article. That would be really great. For example, I am no expert in theoretical physics, but I may enjoy an article about it; how do I know it's not a bunch of bullshit I am reading? If I see that Stephen Hawking gave it a 5-star reputable-source rating, I "know" I can believe it.
Just as a sort of general, broad response to this sentiment: yes absolutely. I envy your optimism, but the belief amongst the Hacker News set that humans are universally capable of educating their way out from under deeply-rooted biological faults is misguided. You're not going to educate people out of conferring status on the most popular people/stories/posts/etc. You're competing against at least a few hundred thousand years of evolution (and many of these processes go back much, much further than that.) You won't win.
So, yes, absolutely we should be working on ways to acknowledge and route around human foibles, not pretending we can eliminate them.
But https://www.npr.org/sections/money/2017/08/25/546127444/epis... convinced me that this is a case where education is NOT enough and there's a time where the only solution is plain censorship, unfortunately.
The censorship doesn't have to be secret. It can be auditable and publicly recorded. But in that case in Ukraine, they did all the education work and let the fake news through, and it wasn't enough to stop the country from being ripped apart.
You will never educate the masses and even if you did, some of what you taught them would be wrong.
However, it seems possible to get further along than we are now by creating betting pools/markets where the participants do agree on a set of sources for information.
Actually, I've thought along these lines before, rather indirectly, by theorizing that the WSJ was probably the most non-partisan source of "news", because most of its clientele demanded such due to their participation in the markets. But I realize that's highly speculative.
You might like the extension Facebook Demetricator: http://bengrosser.com/projects/facebook-demetricator/
Personally I block the entire news feed with uBlock Origin which has made my experience of Facebook tolerable.
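For anyone curious how that's done: uBlock Origin accepts cosmetic filters under "My filters" that hide page elements by CSS selector. The selector below is an assumption on my part (Facebook's markup changes frequently and uses obfuscated class names), so treat it as a sketch, not a guaranteed rule:

```
! Hide the Facebook news feed (paste into uBlock Origin > My filters)
! The [role="feed"] selector is an assumption; adjust with the element picker if it breaks
www.facebook.com##div[role="feed"]
```

In practice uBlock's built-in element picker is the more robust way to build this kind of rule, since it lets you select the feed container directly.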
Social media is a game, and the score is whatever metric is shown. You want the most points for yourself, and you only want to participate in popular content in the hope it will earn you more points.
I am seriously considering a year without adblock and going out of my way to click as many ads as possible. Blocking is ineffective, it's time to fuck over the metrics
News itself is an echo chamber. Don't see how that is facebook's fault.
The reason news is toxic isn't facebook. It's foxnews, nytimes, wapo and the entire media industry.
Not sure why you think "like" buttons would change that.
The news echo chamber existed before Facebook. It existed before Zuckerberg was born. It'll exist long after Facebook is gone.
The only solution is for facebook to ban news entirely from facebook. But then the news industry would get very upset.
My Facebook feed is mostly personal updates about hundreds of people I have met, and it feels like the most democratic media I’ve ever seen. It improved a lot when I unfollowed the handful of people who post junk like sportsball.
Are you assuming the faceless masses are exposed to some unimaginable horror that you aren’t seeing? Or tell us more about the horrors you do see.
The Democrats thought the same thing too, before Trump won the election. Perhaps you are falling victim to a filter bubble confirmation bias?
A democracy implies independent national sovereignty. When anyone (govt, corp, or rogue hackers) across the world can effectively manipulate your elections at scale, sovereign democracy is no longer a thing.
I wonder if Reddit has as much fake news as Facebook?
Remove the counts and people have no external validation for the legitimacy of a piece of content; they actually have to read it. What we have now is a system where someone sees a headline, “Proof Obama birth certificate fake!”, with 100k likes and 10k comments, and they don’t even read it, they just comment, “I knew it!”
When you pick up a newspaper you have to actually read it. You can’t just look at the headline and see how many people agree with it before you can form any kind of opinion.
For almost all stories you can work out their popularity just by the number of comments. And you can estimate the comment count (to within an order of magnitude) just by scrolling through the comments. The exact number isn't important for most people.
None of this will prevent the dissemination of fake news. What will is (a) not prioritising it and (b) detecting whether it's fake. Facebook just added solutions for both.
People comment on stories with the most comments and usually reply to the first set of posts because they want their post to be seen, accelerating commenting in that thread.
If you want to count comments fine. Most people won’t. FB actually makes this hard to do.
Would be fascinating to even try it here on HN. If you hid all the integer counters it would dramatically change the way people use the site.
> you can work out its popularity just by the number of comments
Popularity, yes. Accuracy or controversy, no.
Instead of showing people only content that they agree with, with counts that reinforce the legitimacy of that content, you would show people everything that is popular and let them determine for themselves what they think about it.
I’m not talking about removing the comments. I’m saying don’t show the count. If people want to read the comments they can.
1. It's Facebook, not the government.
2. They aren't banning anything.
How is this a free speech issue?
At least, that's how this news reads from my admittedly-cynical perspective.
It's a no win situation at this point. I'm not sure there's any solution short of removing news from the platform.
This feature is in fact guaranteed to cost far more than a few experts ever would.
But Facebook knows that any expert panel will without a sliver of doubt quickly converge to a ranking that has the Economist and the New York Times in top positions, and Breitbart somewhere behind a random word generator.
Any working statistical method will lead to the same result, obviously. But it gives Facebook the option to invoke HN's favourite argument: "it can't be political because it's the algorithm. See: here are numbers."[0].
[0]: Compare, for example, the libertarian love for Bitcoin, and how it's supposedly free from the politics that undermine central banks.
I read articles in the New York Times, the Economist, and also Breitbart (UK edition) fairly regularly. While they cover very different stories, as you'd expect given their political biases, I have not noticed any major difference in accuracy when dealing with objective facts. This is partly because "mainstream" media is quite unreliable, rather than any awesome quality of reliability inherent to Breitbart, but the idea that Breitbart is uniquely unreliable seems to me to be coming from people who simply dislike conservative worldviews... and desperately want to stop people from reading them. Same reason they try to smear anyone who goes looking as nazis, bigots, etc. They fear that if someone reads things from the "other side" they might find it's not so unreasonable after all...
They don't want to alienate the conservative crowd who basically check in to Facebook ONLY to see the day's new anti-Hillary meme.
I don't know about you guys, but my Grandma clicks on ads at an almost bot-like rate.
If there's no particular overlord that people can point to, they can't be as outraged at it.
Is this truly ignorance or is it malice?
They legally, morally and ethically have the right to determine what content their users see. Especially given that we have demonstrated evidence of fake news being disseminated.
The idea that this was supposed to be a democratic process was never claimed nor should it have been expected.
Expect mass brigading by The_Donald.
1. Told their viewers that it was illegal to read the DNC emails leaked to Wikileaks themselves, and to just listen to what CNN has to say about them.
2. Got caught posing one of their cameramen as a protester/rioter for an interview about why he was "in the streets".
3. Deceptively edited a video of Trump dumping fish food into a koi pond to omit the fact that he was following the Japanese PM's lead, then spent a whole day talking about how disrespectful it was.
4. Spent a whole morning talking about how Trump drinks diet coke, while there was an active terrorist attack in NY.
That's just what I pulled off the top of my head, too.
Edit: to the user that downvoted my comment: which of the four bullet points was false? It's easy to protect the liars who benefit our goals. It takes intellectual honesty to criticize one's allies.
[0] https://medium.com/@evanpopp/google-changes-its-algorithm-to...
It is a randomised sample of their user base. It will not be possible for any group to gain an advantage.
Expect mass brigading by /r/politics.
Forgive me but if the Powers That Be were breathing down my neck 24/7 I think I might start to believe I'm expected to "do something."
Facebook isn't the arbiter. Their users are.
And if a substantial percentage deem a website to be untrustworthy then Facebook simply won't display it as often.
That all said, there is a redeeming possibility here. You could also do an in-house monitored, qualified peer review of select articles from popular sources and use those results to audit users. If somebody keeps hating on WaPo and pushing Fox you could make assumptions about the quality of their judgement.
Worthy of experimentation, but not to decide what's shown on my wall.
You probably can get there in the end, but it's probably easier to pay somebody to objectively rank the political position and journalistic integrity of major publications.
It's a bit tone-deaf, Mark. Remember the "we had no effect on the election" line that everyone laughed at? This year it's "user curation is the answer to fake news". Everyone is laughing at you again.
If you want to determine credibility, hire people to judge credibility. Hire a mix of viewpoints. It's not that hard; it just costs money, which you have more than enough of. Maybe give some of that money back to the users who turn their online lives over to you, and actually improve their lives, instead of just monetizing them by feeding them garbage that gets them to click and consume, while at the same time complaining that users aren't discriminating enough with those clicks and consumption.
Also, that people were using some sleazy disinformation on Facebook as a convenient excuse.
> "I think there is a certain profound lack of empathy in asserting that the only reason someone could have voted the way they did is because they saw fake news,” Zuckerberg said. “If you believe that, then I don’t think you internalized the message that Trump voters are trying to send in this election."
Wonder if they included the part where they (Twitter) went to RT, which is pretty much Moscow's propaganda arm, and offered to sell them $200k worth of ads geared specifically toward the US elections crowd.
See more stories from preferred sources, and block news sources you don’t like.
- Under "Preferred," list publications you want to see more news from.
- Under "Blocked," list publications you don’t want to see any news from.
IMHO anything on FB beyond connecting with people is worthless anyway and mostly driven by bots and not people these days.
But user ranking is basically FB saying "no, no. We don't do news, that's not our business. Sign waiver here"
Now it becomes a battle of which side can marshal more FB users to flag the other side's stories as fake. Which, in turn, results in more views and clicks for FB.
Furthermore, it opens the way for anyone with a botnet of infected machines to rig the credibility of any news outlet to their liking.
Facebook draws its revenue from personality profiling its users for targeted advertising. How much more profit do they make when they can actively push people towards binary thinking, and make them easy to categorize?
They could, for example, only count votes by long-term users. They also have almost complete knowledge of their users' political leanings, making it possible to counteract any attempts at manipulation.
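One way to picture "only count votes by long-term users" is a tenure-weighted average, where new accounts (likely bots or brigaders) contribute almost nothing. This is a minimal sketch under my own assumptions; the function name, the 5-year weight cap, and the rating scale are all hypothetical, not anything Facebook has described:

```python
from datetime import date

def weighted_trust(votes, today):
    """Tenure-weighted trust score for an outlet.

    votes: list of (rating, account_created) pairs, where rating is in
    [0, 1] (0 = untrustworthy, 1 = trustworthy) and account_created is a
    date. Weight grows with account age and is capped at 5 years, so a
    freshly created botnet account has almost no influence.
    Returns None when there are no votes.
    """
    num = 0.0
    den = 0.0
    for rating, created in votes:
        years = (today - created).days / 365.0
        weight = min(max(years, 0.0), 5.0)  # cap influence at 5 years of tenure
        num += rating * weight
        den += weight
    return num / den if den else None

votes = [
    (1.0, date(2010, 1, 1)),   # long-term user trusts the outlet
    (0.0, date(2017, 12, 1)),  # weeks-old account flags it as fake
]
print(weighted_trust(votes, date(2018, 1, 20)))  # dominated by the long-term vote
```

An unweighted average of these two votes would be 0.5; the tenure weighting pushes the score close to 1.0 because the distrusting account is only weeks old, which is the point of the design.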
"Here are some articles at the polar opposite to your norms."
Instead, we show people what they want to see, with a number next to it that reinforces its popularity and legitimizes whatever it is. That's part of how birtherism became a thing: you only see articles from a source you trust, and you see that 100k people also approve of every post on the topic. “Hmmm. Well, those 100,000 people must be onto something!”
No algorithm is (yet) smart enough to judge the quality of journalism. Facebook shouldn't try.
Snapchat's a tabloid, Facebook is basically a news RSS feed, and I'm even starting to get news "stories" when browsing hashtags on Instagram. I just want to discover people and talk to people. Apart from dating apps (Tinder/Bumble/etc.), it seems that all social networks have become news networks.
It could get worse in the future, but it's relatively small (1 million known active users) and has a positive founder effect, so for now it's very good.
Future Republican candidates can put out FB ads targeting only Orange County voters who dislike CNN's news article on Trump.
The Democratic Party can put out FB ads asking for campaign contributions only from folks in SF who like CNN's news article on Trump.
"It declined to say how many people were polled or which news outlets they were asked about."
Well, in a country where 30 states voted for Trump, 20 states voted for Hillary, and 100 million people chose not to vote (and that's 42% of the eligible electorate[0]), who shall be the arbiter of what news, at least U.S. political news, "nearly no one disagrees with"? I mean, given that a majority of politically active people in a large majority of states voted for Trump, I expect you'll agree with me that Alex Jones' endlessly pro-Trump daily videos should be featured right at the top of everyone's News Feed. I know I'd love that.
I don't use my Facebook much except to occasionally share vacation photos these days. Checking recently seems to show no news of any sort. Just some silly memes, weather phenomenon (supermoons!), pop-geek-culture stuff, and daily status updates. Since it's the season, there's some American football discussion, which people obviously have disagreements on, of course, but in that "sport" sort of way. :) I'm sure major news (like major weather events) will still make my feed, but there seems to be a noticeable decrease in political stuff.
My cynical take on this announcement: for those who post and share a lot of news, there is a high probability that your news bubble will be more heavily reinforced. Not all Trump voters are Alex Jones heads, of course, but if you are one, maybe Alex Jones will still be at the top of your news feed, and you will get all of the Alex Jones-y shares you want. It just won't be on the feed of the apolitical friend who merely shares silly cat videos, or the person who shares Rachel Maddow clips. Facebook, in other words, will strive to bring you the best news feed that you will never disagree with, all the better for dopamine-inducing clicks and likes.
(I don't really know, of course, if my cynicism is off the mark, but these sort of opaque-algorithm news feed games Facebook plays are a large reason why I rarely use my Facebook feed. :) )
But I still believe that Facebook could whitelist news sources too. Obviously, if they do, there will be a massive cry from those who thought Breitbart or Drudge would make the cut.
Facebook users will be in an even greater echo chamber reassuring their world-views, and therefore more engaged with the website, which is all FB cares about.
My feed is basically a bullshit mill of news instead of actual "friends'" experiences. Even setting aside the bullshit, a good % of the experience stuff is basically just selfie show-offs. It's almost become a mirror of the self, with very little socially valuable interaction.
oh you're on a beach ...like
oh you want to signal your virtue ..share
Facebook is starting to become the anti-social platform in the sense that it has an overall negative social effect.
I'd argue Rome died not in 476 AD but in 121 BC, when the Senate killed Gaius Gracchus and didn't realize it needed to reform and create an actual political process that doesn't lead to smooth talkers and thus to tyranny.
I believe Plato's and the Founding Fathers' preferred solution here would be for each individual to find the smartest willing person they know and let them offer their analysis for consideration.
Suddenly they're going to be able to buy credibility the way they buy likes?
I'm not sure I'm a fan of this. I hope they've gamed that out.
That said, here's a story. One day during the presidential campaign, I saw a person who normally really despised Donald Trump say, "Hey, I hate him, but I think he kinda made sense in his speech today". Remember, she watched the speech because she hated him and was looking for every chance to make fun of him, but instead came away rethinking things, for just that speech.
Then the next day she watched mainstream TV news anchors go through the speech line by line, criticizing it using Trump's past history and so on. Then she said, "I knew it, he's evil; he almost made me listen to him, so devious".
I normally never watch TV, so observing someone react in completely opposite ways based on just a 30-minute TV show is when I realized how mainstream TV is shaping people's minds, and that propaganda in 2018 is stronger than ever. Most people won't want to believe that they are under the heavy influence of political propaganda, because their identity is connected to it, but that is the truth.
Just to be clear, this is not even about politics. This phenomenon is increasingly common across every aspect of the society. For example, if you look at what's happening in the cryptocurrency ecosystem, it's full of propaganda by people who have more knowledge manipulating people with less. The only way to overcome this is to:
1. Try to be as emotionally detached from the events as possible
2. Actually try to learn what is going on, instead of listening to what everyone else is saying, because even your most trustworthy friend, family member, or even a very reputable Nobel prize winner is under this influence unless they followed this principle, which most people don't have time to do.
3. If you ARE that reputable person, be careful what you tell your followers. If you haven't gone through steps 1 and 2 and are just saying "This is who I am, I'm just expressing myself, take it or leave it, just unfollow me if you don't want to hear what I say", you are being extremely irresponsible. You are basically taking your own gullibility and amplifying it to hundreds of thousands or millions of other people who are probably in a less fortunate position than you are (which means they will suffer exponentially more from this misinformation than you will).
Facebook doing this won't fix fake news; it will only accelerate what they are already guilty of, because most people aren't even aware they are misinformed, and furthermore don't want to believe they are wrong. So it will only result in a larger and larger filter bubble that separates people even more.
It's like religion: when was the last time you managed to convince a religious person that God doesn't exist? Imagine asking people on Facebook to vote on whether God exists.
Personally I think there's a huge opportunity hiding in this madness somewhere if you're an entrepreneur.
> For instance, Buzzfeed's initial beginnings as a viral site would have almost certainly hindered its growth into a serious news organisation had it been subject to the ideas about to be put in place by Mr Zuckerberg's team.
I'm not really sure what the answer to this is - but they are really good now and would definitely have been perma-black-holed before now by many measures I can think of.
http://www.journalism.org/2014/10/21/political-polarization-...