Do I think that society will transform its views on nude images overnight? No, obviously not. Do I think that every person in society will go with the flow and adopt new assumptions about nude images? Also no. But it seems entirely logical to me that once the average person notices "hey, I'm being inundated with 1000x more fake nudes than I used to be," they will eventually reach the conclusion that "I should start assuming all nudes are fake by default."
I get that it feels cozy to think it'll just blow over, but that ends up being a thought-terminating cliché that gets repeated ad nauseam in these conversations. Even if you think the problem will pass, it's reasonable to have some idea of what to do if it doesn't. That's more productive anyway!
I feel like education is probably our best bet for sort-of-kind-of dealing with the repercussions of the problem. Ultimately, attempting to legislatively regulate the production and private distribution of AI nudes just doesn't seem enforceable to me; Brandolini's law applies, in that fakes are orders of magnitude cheaper to produce than they are to debunk. I think our best hope to mitigate the negative impact of this tech is to talk widely about it and actively try to make my "prophecy" come true. Sex ed curricula should start including warnings that fake nudes are an epidemic, and begin building the new social norm of not assigning moral judgements based on them.
I could also see some government-funded PSAs on popular streaming services/TV channels/radio stations being effective. You'd want to disproportionately target that messaging at older members of society, since they're less technologically literate on average and therefore more likely to be emotionally traumatized by tech like this.
Now that I'm writing this, I do wonder if there are some regulatory approaches that could help, especially since we have had some success banning CSAM legislatively. The problem with AI nudes, I think, is that they are destined to become indistinguishable from genuine nudes, so a blanket ban is impossible without establishing a panopticon (which is obviously bad). Blanket banning works for CSAM because there is never any case in which a piece of CSAM should ever be knowingly stored or distributed, regardless of its origin; it is always immoral. Adult nudes complicate things because they can be shared consensually by the adults involved, and banning that practice would be absurd.
I'd be curious to hear your thoughts as well. What sorts of options are there to combat this new AI-generated nude plague if my optimism about the situation turns out to be misplaced?