https://www.adfsolutions.com/news/what-is-csam
> However, the phrase “child pornography” is almost too sterile and generic to properly exemplify the horrors of what is being created. That is why many advocates, including the National Center for Missing and Exploited Children (NCMEC), believe this phrase to be outdated.
> NCMEC refers to these kinds of material as Child Sexual Abuse Material (CSAM), in order to “most accurately reflect what is depicted - the sexual abuse and exploitation of children”.
> As a result, many organizations and advocates now refer to this material by the new term rather than "child pornography" because it explicitly ties the material to the source of the problem: the abuse that is being perpetrated to create it. Furthermore, children are re-victimized every time a file is shared, sustaining the abuse in a continuous loop.
'Child pornography' is already too long to say repeatedly in a sentence, so it is often shortened to "child porn." "Child Sexual Abuse Material" has far too many syllables, so it too is shortened, to the two-syllable spoken acronym "CSAM", and we're back to square one.
With that said, this is a fascinating display of someone pressing the reverse button on the euphemism treadmill. I don't think I've ever seen that before.
But children are incapable of being consenting performers. That's what separates abuse material from porn. So to make damn sure there isn't any overlap in the Venn diagram of media that depicts sexual acts, they'd rather not associate "child porn" with anything else that exists in the universe of "porn".
It's a completely separate category, not some "bad" end of a spectrum.
That's the reasoning I've heard before, anyway.
And I agree with you on the reverse treadmill thing. It's interesting. On a related tangent: I've always hated how journalists use the term "sexual assault" to refer to a wide range of offenses, from forcible rape to a passing grope. Although those are both bad things, it's clear that one is tremendously more harmful than the other. We should use language to clarify that.
I think saying “child porn” is clearer. That term is horrific as it is. If anyone sees it in a sentence they’ll be rightly revolted and know what’s being talked about right away.
I had to have someone explain the CSAM acronym to me. How many people are going to skip over that because they assume it's something benign and unrelated?
Using scarier words will get the public to trade liberty for security every time.
I've seen this argument many times, and I agree that the initial act is horrendous, of course, but I believe this overstates the ongoing damage.
From what I recall of the episode, porn (rightly or wrongly) is seen as opt-in for the participants. But if children are involved, it cannot be opt-in and is more appropriately described as rape or sexual abuse. Thus CSAM is the preferred term.
Should both parties be imprisoned and go on sex offenders registries for life?
The detection system uses a library of known-CSAM hashes that are compared against the hashes of the photos on the phone.
The real problem comes when governments make laws that open the mechanism to different libraries of photos.
For example, a parent's photos of their kids in the bath aren't CP. However, _someone else_ having a _quantity_ of bath photos is treated as CSAM if they have no legitimate reason for possessing them.
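To make the matching step concrete, here is a minimal sketch, assuming a flat set of known hashes. This is not any vendor's actual implementation: deployed systems use perceptual hashes (e.g. PhotoDNA or Apple's NeuralHash) rather than exact digests so that re-encoded copies still match, and the hash value and directory below are hypothetical.

```python
import hashlib
from pathlib import Path

# Hypothetical hash library. A real system ships perceptual-hash digests
# supplied by NCMEC, not SHA-256 digests of the exact bytes.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def digest(path: Path) -> str:
    """Exact-match digest; stands in for a perceptual hash here."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan(photo_dir: Path) -> list[Path]:
    """Return the photos whose digest appears in the hash library."""
    return [p for p in photo_dir.glob("*.jpg") if digest(p) in KNOWN_HASHES]

matches = scan(Path("~/Pictures").expanduser())
print(f"{len(matches)} photo(s) matched the hash library")
```

Note that whoever supplies KNOWN_HASHES decides what gets flagged, which is exactly the worry above about governments pointing the mechanism at different libraries.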
We definitely should punish exploitation of children, including sexual, and we definitely should punish distributing images of this. But conflating exploiting, distributing, and viewing, and then putting a big taboo on this, is really not ideal. These things are different. Otherwise what we end up with is a righteous frenzy when someone gets punished for sexting.
For example, the NCMEC database contains hashes for Nirvana's Nevermind cover. Completely innocent to possess within its original and intended context.
I have not said whether I agree with this, because I do see problems with automating the process.
However, the precedent for using it as a flag for law enforcement has already been set. Having a collection of similar imagery is considered CSAM - and that is likely correct. A collection is probably not innocent. But having one or two images may happen incidentally, without your awareness of it.
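In practice, the collection-versus-incidental distinction is implemented as a reporting threshold. A minimal sketch, with an assumed threshold value and function name (Apple's 2021 proposal reportedly required on the order of 30 matches before any human review):

```python
# Illustrative only: the constant and names below are assumptions,
# not any vendor's actual parameters.
MATCH_THRESHOLD = 30  # reportedly ~30 in Apple's 2021 proposal

def should_flag(match_count: int, threshold: int = MATCH_THRESHOLD) -> bool:
    """Escalate for human review only once matches suggest a collection."""
    return match_count >= threshold

assert not should_flag(2)   # one or two incidental matches: ignored
assert should_flag(45)      # a collection: escalated for review
```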
As for who decides what counts as significant, and where that threshold sits? That is where you'll hit the most problems, and reasonable discussion of it will be quickly shut down with the same arguments used to justify automating the flagging in the first place. The conversation requires nuance, but those currently calling for such systems aren't interested in a good-faith discussion.