> Siri and Search are also being updated to intervene when users perform searches for queries related to CSAM. These interventions will explain to users that interest in this topic is harmful and problematic, and provide resources from partners to get help with this issue.
Will apple.cn be extending this to searches about "tank man" or a certain stuffed bear? Oh bother...
The obvious implication is that your searches are being reported to some unaccountable authority.
I have no doubt that this creates a chilling effect against public discussion about these practices.
If Google were doing its job, no search results would contain CP and they wouldn’t need to warn anyone. They probably shouldn’t warn anyone anyway, since all it does is alert people who might be looking for CP that they are being watched, and scare anyone who isn’t looking but whose search terms happen to contain similar words.
The outcome is overall worse than if no alert popped up.
There is a very real, bright-line difference between popping up a “please get help” message for self-destructive behaviors and arbitrary censorship.
Source: have eating disorder and consumed thinspo and proana content in my teens.
(I can trigger my partners with 75% reliability if I just speak in a high pitched voice)
Have you seen an article that, say, criticizes the prison system and needs to start with a reminder that killing people is wrong? Definitely a strange sight. This looks like the Soviet-era prefaces about the decisions of the latest CPSU congress and some relevant opinions of comrade Brezhnev that were expected to be found in any decently sized publication, whether it was a materials science textbook or a paper on the Babylonians. It doesn't matter what you think, just do the required dance.
This might seem like nitpicking, but that preemptive display of obedience is the very thing that allows the likes of Apple and its customers to use the pretext successfully.
Children aren't very tough. If they don't have nearby humans intervening to protect them they tend to die or do badly. Evolution favours people who have strong instincts to protect and promote children. It is plausible.
Just one headline ("bruce343434 was found with CP on their phone!") is there for life. Child abuse and rape are two of those accusations that never seem to fade, and a mere accusation destroys lives, even if it is later proved to be untrue.
There is a widespread belief that proper technical solutions are enough: give us end-to-end encryption and such, and everything will be all right. But people do things for a reason, and technical solutions are introduced accordingly. We need to look there to understand what's happening.
For an outside observer, the messaging media filter example does not even look convincing. So there's a Bad Guy chatting with a kid who can organize their meeting or take the conversation to a different, non-filtered service. And that's completely okay! (Unless, of course, there happens to be a need to promote Big Brother processing all conversations to protect the kids, heh.) However, when we mention sexual content being sent or received, there's a sudden wild flight of fantasy and countless dangers on the horizon. It makes one wonder whether the real goal was to protect not the kids but the parents, from the worried thought that their kid is not completely isolated from the sexual sphere. The outcome is that today Beavis can't tell Butthead “Wow, look at those tits!” and send the picture without being reported. What a perfect repressive Victorian childhood, and what an outstanding member of society it will produce!
The taboo is twofold. On the one hand, it creates new positions of power for the people who enforce it legally and in the discourse, people who won't just dismantle themselves (quite the contrary: see the worldwide practice of drug prohibition and its effects on laws and bureaucratic growth). On the other hand, it creates inflammatory excitement about the topic in the common person. Media knows well which stories, told from the correct angle, obviously, attract the public. As a result, there is a stereotypical image of a maniac hiding in the shadows, and the need to “do something about it”. In fact, maniacs (also a stereotype formed by media, by the way) are rare: in eight or nine out of ten cases of child sexual abuse it's a person close to the child who decides to “search for happiness” together in such a way.
If the father makes his daughter send him sexual photos, and he is also the one who gets notified about it, what is the point? Observe the observers, too? That's a bureaucratic dead end. Or is such a system, one that silently looks into suspiciously numerous parental approvals, already in place? Then what about the father who gets notified about his daughter sending her boyfriend sexual photos while being completely okay with it? Well, maybe such a father would disable the feature to stop feeling like a third wheel, but what would be the opinion of Big Brother? Effectively, the point of view of “the whole society”, which you could previously silently ignore, is transparently codified and enacted by the computer.
I'm pretty sure I have, actually. In the age of Twitter mobs that can have you fired over a misplaced comma, nothing should be left to chance. And protecting yourself from a misunderstanding or ambiguity isn't enough anymore, as a lot of the time these people are malicious. They will not only twist your words, but make things up entirely in order to make you look bad. And when (if) you get your 30 seconds to defend yourself, you'll want to have something short and unambiguous right at the top of the page to point at.
We have books written 15 years ago being scanned that now have to be rewritten. Publishers should never acknowledge these criticisms; this is a form of the worst censorship literature has to contend with.
Of course, ignoring everything isn't a solution either; outrage can indeed be justified. But I think outrage has to surpass a level that prompts more action than a Twitter post or a change.org petition. If the outrage still persists after that, maybe there is a problem you need to address.
Companies that comply this willingly would also do it for any regime. This is nothing that should be praised at all, and especially not seen as "progressive" or "tolerant". It should be an example of how free enterprise indeed fails to act responsibly and champion any values. That would be a healthy perspective.
This cannot last much longer. A lot of political division in this country is due to technology. As someone genuinely interested in computation, it upsets me to realize this, but more and more it seems inevitable.
It is already game, set, match.
How were they historically? In person, where people gather.
> The masses have already been beaten into submission and thoroughly enslaved.
Is this simply because technology exists, or because it is monopolized? I think it's the latter, and it certainly projects a path out of this situation.
Or at my parish, which is becoming more and more like family. People are fed up.
Can you imagine: everyone at the NSA is celebrating Apple and the CSAM automated scanning.
Simultaneously, connected intelligence officials: wait, they aren't going to let the nation state AI judge what I do for the STATE? Surely, no god-like AI would understand what is necessary. No xir.
Is the fact that we need this new word a sign that we've gone too far? It's 1984 already, isn't it??
Super-private, as in Supervised Private Data. :D
One of the central themes of 1984 was thought-crime and how one was trained to recognise and avoid it (without ever specifying the parameters of its definition!).
How can we trust this?
Apple designed the system so that their hashes are never known to client devices. Their server is fundamentally involved in checking your hashes against their list.
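To make that design concrete, here is a deliberately simplified toy sketch (my own illustration, not Apple's actual protocol, which uses NeuralHash and a blinded private-set-intersection scheme): the blocklist lives only on the server, the client submits a hash-derived voucher for its content, and only the server performs the comparison. SHA-256 here is a stand-in for a perceptual hash, and all names are hypothetical.

```python
import hashlib

def content_hash(data: bytes) -> str:
    """Stand-in for a perceptual image hash (assumption for this sketch)."""
    return hashlib.sha256(data).hexdigest()

class Server:
    """Holds the hash blocklist; clients never receive it."""

    def __init__(self, blocked_items):
        self._blocklist = {content_hash(b) for b in blocked_items}

    def check_voucher(self, voucher: str) -> bool:
        # Matching happens exclusively server-side.
        return voucher in self._blocklist

class Client:
    """Can hash its own content but holds no copy of the list."""

    def make_voucher(self, data: bytes) -> str:
        return content_hash(data)

server = Server([b"known-bad-image"])
client = Client()
print(server.check_voucher(client.make_voucher(b"known-bad-image")))  # True
print(server.check_voucher(client.make_voucher(b"holiday-photo")))    # False
```

The real system adds cryptographic blinding so the client cannot even test its own hashes against the list, plus a match threshold before anything is revealed; this sketch only shows the division of knowledge the comment describes.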
(edit: you can buy a deGoogled (but still Android) Fairphone from eFoundation: https://esolutions.shop/shop/e-os-fairphone-3-plus-fr/)
It is very public now that Apple will scan for such pictures, so how many pedophiles will keep them on their phones?
Sure, but when it’s mandated by law, what then? How difficult would it be to force insurrection-related imagery into that hash database? Apple would be in a very tough spot if that were made into law.
They’ve already opened a can of iWorms, because now that they’ve shown it can be done, lawmakers in any country can mandate the same technology be added to every device and it doesn’t have to be restricted to CSAM. That’s just the family-friendly excuse.
People here are not okay with a local scanner installed on your device. Despite Apple's “promise” that it will only be used for photos, and only those you “intend” to upload… this falls on deaf ears. If you were only going to scan content intended for the cloud… then just scan it on your servers when it arrives.
Despite talk of E2E… Apple has not said this is the reason for the local scanner.