Facebook needs to grow up. Removing porn is one thing, but clinical images, mothers feeding their children, and the like should not even be up for discussion.
It's a fine line between protecting your users from seeing offensive content and outright censorship. Good to see them doing the right thing in this case; pity it is still handled on a 'case-by-case' basis instead of through a proper review of their policies.
The main criterion seems to be 'is the internet raising a large enough stink?' If yes, restore the image.
The Vietnam photo censorship was met with quite a bit of confusion, since I think most folks remember it from their high school history books.
I'm starting to think that one group who keeps calling the FCC over TV shows is alive and well online.
That said, I'm not convinced that sexuality on its own is damaging either. I'm not sure if that's what you implied with the second half of your statement, i.e. the nudity vs. sexuality distinction.
We need to stop calling naked female breasts "porn".
For the simple reason that, well ... they're fucking not.
How the fuck can the naked human body be offensive? Since that idea is a purely historico-religious ideology, it equates to saying "God made a mistake".
What nonsense.
That's the same criterion for taking things down though.
In the worst case, it's their NSFW algorithms identifying a nipple in the picture, with no regard for context.
This way, even if they mis-categorize a medical image, it won't be banned, just not widely disseminated.
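To illustrate that distinction, here is a minimal sketch of such a rule; the scores, thresholds, and function names are all hypothetical, not anything Facebook has documented:

    # Hypothetical moderation rule: only high-confidence NSFW detections are
    # removed outright, while borderline cases (e.g. a nipple detected in a
    # medical image) are merely shown to fewer people pending human review.

    from dataclasses import dataclass

    @dataclass
    class ModerationDecision:
        action: str  # "remove", "demote", or "allow"
        reason: str

    def decide(nsfw_score: float,
               remove_threshold: float = 0.95,
               demote_threshold: float = 0.60) -> ModerationDecision:
        """Map a classifier confidence score to a moderation action.
        The thresholds are made-up illustrative values."""
        if nsfw_score >= remove_threshold:
            return ModerationDecision("remove", "high-confidence NSFW detection")
        if nsfw_score >= demote_threshold:
            return ModerationDecision("demote", "uncertain detection; limit reach, queue for review")
        return ModerationDecision("allow", "no action")

    if __name__ == "__main__":
        print(decide(0.72))  # borderline case: demoted, not banned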
They reviewed their policies over a year ago, and images of breasts are allowed if they show breastfeeding, a post-mastectomy chest (reconstructed, tattooed, or just left as is), or are raising awareness.
http://www.huffingtonpost.co.uk/2015/03/16/breastfeeding-fac...
https://www.facebook.com/help/340974655932193
https://www.facebook.com/help/318612434939348?helpref=search
What you see here are people reporting images, and Facebook's algorithms / employees mistakenly banning them.
Just look at this video, and see how ridiculous it is that Facebook had an automated tool that would take that down:
But people are offended by seeing clinical images of nudity, mothers feeding their babies, etc. And Facebook wants to placate all of its users so nobody has a reason to leave.
Of course, they can't please everyone, but they'd rather make a fool of themselves than become a more morally/socially opinionated corporation like Starbucks.
"Hide this post"
I'll take honest answers and snarky ones as well, provided the snarky ones come with a genuine laugh, because with this DDoS going on this morning I could REALLY use a chuckle right now...
I've yet to meet someone like that. Are you offended by any of this? Is anybody else on HN offended by such images?
Besides that, we're not talking about porn. Unless you want to count breast cancer videos and mothers feeding their children as porn, but that would be a perception error; the label simply does not apply.
This isn't about what they can or can't do, it's about what they should or shouldn't do.
The benefits of breast cancer awareness information outweigh any imaginable harm. If you don't want to see it, then you certainly don't have to.
Why wouldn't I? You assume I'm some sort of prude?
They'd fit right in with everything else in my feed that my friends post and I don't care about.
Why can't people let Facebook know what should and shouldn't be filtered?
Facebook already has a policy that breasts are OK when they're raising awareness of breast cancer or being used to feed children, so Facebook is ignoring its own existing policy.
I mean, every time you see a male dog there's a penis getting about with it, and we don't seem to mind those.
Probably because dogs haven't had the Church tell them their bodies are shameful for a thousand years.
I think that FB is inside a fear bubble of its own making. They've had to apologize too often lately, and this hints at something wrong in their vetting process. Maybe too much imperfect AI, or too many imperfect human judgements.
I wonder if seeing breastfeeding triggers a fundamental paradox in us humans, seeing as how we're the only primates who have boobs even when not breastfeeding. Originally, boobs meant that this female is not available for sex. In humans they mean that she is very available BUT ALSO that she isn't.
This is confusing and this conflict causes discomfort.
So you would be "uncomfortable" having a female coworker that was using a breast pump? No wonder this industry has such a gender problem!
It appears that Facebook never argued that the image was in breach of its policies, just that some software it runs had a bug that misclassified this image.
Then, when challenged, they apologized and approved the ad.
So to me the summary appears to be "Software company has a bug that affected one customer, apologizes, and fixes the issue", which must happen every hour of every day...
Am I missing something?
I'm really tired of people engaging in pointless signaling campaigns and expecting to get points for being So Brave in the face of near-universal consensus that they are correct, or taking minor bureaucratic snafus like this as evidence that they are somehow not in a position of complete victory.
Facebook is trying to automate the detection of illegal/unwanted images, and it seems extremely difficult to detect the context of an image well enough to differentiate between acceptable images of human bodies and unacceptable ones (which would be, I assume, the vast, vast majority of such images posted).
I wonder how they could proceed with this: maybe with some sort of anomaly detection, where you do a first pass to detect all images containing the unwanted features (e.g. naked bodies), and then a second pass to try to detect the activity that's going on, or to detect whether the image is famous (e.g. a picture of David, the famous Italian statue, would be acceptable, while a photo of a naked man in the same pose presumably would not).
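As a rough sketch of that two-pass idea, something like the following; every function here is a placeholder standing in for whatever real models and databases would be needed, not Facebook's actual pipeline:

    # Two-pass moderation sketch: a cheap nudity detector first, then
    # costlier context checks only for the images that get flagged.
    # All three helpers below are hypothetical placeholders.

    from typing import Optional

    def nudity_score(image_bytes: bytes) -> float:
        """Pass 1: nudity score in [0, 1] (stand-in for a real classifier)."""
        raise NotImplementedError

    def match_known_artwork(image_bytes: bytes) -> Optional[str]:
        """Pass 2a: look the image up against a catalogue of famous works
        (e.g. Michelangelo's David). Returns a title or None."""
        raise NotImplementedError

    def classify_context(image_bytes: bytes) -> str:
        """Pass 2b: coarse scene label such as 'medical', 'breastfeeding',
        'artistic', or 'sexual' (stand-in for a context classifier)."""
        raise NotImplementedError

    def moderate(image_bytes: bytes) -> str:
        # Pass 1: most images never need the expensive checks.
        if nudity_score(image_bytes) < 0.5:
            return "allow"
        # Pass 2: only flagged images get the context analysis.
        if match_known_artwork(image_bytes) is not None:
            return "allow"  # recognized famous artwork
        if classify_context(image_bytes) in {"medical", "breastfeeding", "artistic"}:
            return "allow"
        return "queue_for_human_review"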
But it bothers me that we leave so much of our discourse to such imperfect systems.
I'm not aware of Facebook suppressing the Prager University videos.
And in the YouTube case, it's likely an automated response to (mis)flagging by users who politically differ.
Have one single clear principle and apply it consistently. Change the principle if needed; don't make exceptions. "Educational videos won't be removed" could have been a good policy for Google to have for YouTube.
Or even "No Breasts" could be a good policy too. If you want to show breast cancer videos, do it on YouTube, shoot it with a prop, or link to another page. I don't see why that wouldn't work.