It's almost impossible to even give an affirmative answer to that question without making yourself a target. And as much as I err on the side of creator freedom, I find myself shying away from saying yes without qualifications.
And if you don't allow CP, then by definition you require some censoring. At that point it's just a matter of where you censor, not whether. OpenAI has gone as far as possible on the censorship, reducing the impact of the model to "something that can make people smile." But it's sort of hard to blame them, if they want to focus on making models rather than fighting political battles.
One could imagine a cyberpunk future where seedy AI CP images are swapped in an AR universe, generated by models run by underground hackers who scrounge together what resources they can to power the behemoth models they stole via hacks. Probably worth a short story at least.
You could make the argument that we have fine laws around porn right now, and that we should simply follow those. But it's not clear that AI generated imagery can be illegal at all. The question will only become more pressing with time, and society has to solve it before it can address the holistic concerns you point out.
OpenAI ain't gonna fight that fight, so it's up to EleutherAI or someone else. But whoever fights it in the affirmative will probably be vilified, so it'd require an impressive level of selflessness.
There's a huge case to be made that flooding the darknet with AI generated CP reduces the revictimization of those in authentic CP images, and would cut down on the motivating factors to produce authentic CP (for which original production is often a requirement to join CP distribution rings).
As well, I have wondered for a long time how AI generated CP could be used in treatment settings, such as (a) providing access to victimless images in exchange for registration and undergoing treatment, and (b) exploring whether it's possible to manipulate generated images over time to gradually "age up" attraction. For example, you could learn what characteristics are being selected for and age the others, until you end up with someone attracted to youthful faces on adult bodies or adult faces on bodies with smaller sexual characteristics, etc. Ideally you'd find a middle ground that allows for rewiring attraction to a point where they can find fulfilling partnerships with consenting adults/sex workers.
As a society we largely just sweep the existence of pedophiles under the rug, and that certainly hasn't helped protect people - nearly one in four are victims of sexual abuse before adulthood, and that tracks with my own social circle.
Maybe it's time to all grow up and recognize it as a systemic social issue for which new and novel approaches may be necessary, and AI seems like a tool with very high potential for doing just that while reducing harm on victims in broad swaths.
I'd not be that happy with an 8chan AI just spitting out CP images, but I'd be very happy if groups currently working on the issue from a treatment or victim-focused angle had the ability to change the script however they can using victimless CP content.
Especially the part about maybe generating specifically tailored material to "train" folks. Although, while obviously moral instead of immoral like "gay conversion therapy", I wonder if it would be just as ineffective.
> and would cut down on the motivating factors to produce authentic CP (for which original production is often a requirement to join CP distribution rings).
Hmmmmm. Will machine-generated "normal" (i.e., non-CP) porn really eliminate the motivating factors to produce normal porn? I obviously can't speak for enjoyers of CP. But when watching normal porn, I think part of the thrill for many/most people is knowing that what's happening is real.
Another potential risk is that a flood of publicly available, machine-generated CP might actually help the producers and distributors of real CP by serving as camouflage. Finding and prosecuting the people who make real CP is difficult enough already. Now, imagine if the good guys couldn't even reliably tell what was real and there were 100000x as many fake images as real ones floating around.
Yikes.
I'm wondering how true that is.
Obviously, lots of people consume hentai, and platforms like Danbooru are immensely popular.
Also, speaking personally... some of the porn that I've consumed that felt the most "real" was 3D animations where the only real humans behind them were the SFM artists (and voice actors). These artists felt free to do scenes with, like, actual cinematography, with flirting and teasing and emotions between the characters, of a kind you never see even in softcore live-action porn.
So I do wonder how much potential AI generation has for completely substituting large parts of the porn industry.
Let's assume that AI generated CP should be illegal. Does that mean possession of a model that is able to generate such content should also be illegal? If not, then it's easy to just generate content on the fly and never store anything illegal. But if we make the model illegal, how do you enforce that? Models are versatile enough to generate a lot of different content, so how do you decide whether the ability to generate illegal content is a byproduct or the purpose of that model?
Unless you take "real" to just mean that actual humans were involved, describing porn as real seems to be a bit of a stretch more often than not.
If people already accepted that they need help, there are many good ways to treat people with unwanted sexual obsessions (trying to choose my words carefully here). I honestly don't think that it would help them to serve them more content.
However, I'd love to see some research to explore the possibility of involving machine generated content in psychological treatment. The core of your idea is IMHO brilliant.
Another use case could be making it easier/automatic to create comics. You describe the background, what the characters should be doing, and the dialogue. Boom, you have a good enough comic.
-----------
Reading as a medium has not evolved with technology. The imagery is created in the reader's mind. It's no surprise that some people enjoy doing that (and also enjoy watching that imagery) and others do not.
This could be a helping brain for creating that imagery.
-----------
Now imagine reading stories to your child. Actually, creating stories for your child, where they are the characters in the stories. Having a visual element to it is definitely going to be a premium experience.
30 years since the original issue of encryption, it looks like CP trumps the other Horsemen of the Cypherpunk FAQ [0], with drug dealers and organized crime taking the back seat. It's interesting how misinformation is a recent development that they anticipated; a Google search shows that the term 'Infocalypse' was actually appropriated by discussions of deepfakes some time in mid-2020. That said, the crypto wars are here to stay—most recently with EARN IT reintroduced just two months ago.
The similar issue of 3D-printed guns has developed in parallel over the past decade as democratized manufacturing became a reality. There are even HN discussions tying all of these technologies together, by comparing attitudes towards the availability of Tor vs guns (e.g., [1]).
And there are innumerable related moral qualms to be had in the future; will the illegal drugs or weapons produced using matter replicators be AI-designed?
Overall, I think all of these issues revolve around the question of what it means to limit freedoms that we've only just invented, as technological advances enable things never before considered possible in legislation. (And as the parent comment implies, here's where the use of science fiction in considering the implications of the impossible comes in).
[0] https://en.wikipedia.org/wiki/Four_Horsemen_of_the_Infocalyp...
This makes it as easy as typing a sentence - and the quality seems fairly realistic.
A child with adult body parts is a whole other class of weirdness that might pop out too.
Models want to surprise us all.
First video clips were with the faces of your usual celebrities, but then suddenly I got "treated" to Greta Thunberg in the situations you might expect. I cut my exploration short.
Now, Greta Thunberg is actually 19 (how time flies!), except that the deepfake was most likely trained on her media appearances, which started when she was 15!
(I guess at least she wasn't a child any more, which might explain why those clips had not been almost immediately flagged and removed?)