This is one area where the AI companies should offer the olive branch, IMO. There must be a way to use steganography to transparently embed a "don't process for AI" code into an image, text, music, or any other creative work that wouldn't be noticeable to humans, but that the AI would see if it tried to process the content for training. I think it would be a very convenient answer and probably not detrimental to the AI companies, but I also imagine the AI companies would not be eager to spend the resources implementing it. I do think they're the best source for such protections for artists, though.
Ideally, without a prior written agreement from the original creators of a dataset, the AI companies probably shouldn't be using it for training at all, but I doubt that will happen -- the system I mention above should be _opt-in_, that is, you must tag content as free to be trained on before AI can be trained on it. But I have zero faith that the AI companies would agree to such a self-limitation.
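To illustrate the idea, here's a minimal sketch (assuming Python with Pillow and numpy, and a made-up marker string -- none of this is an existing standard) of how such a tag could be hidden in an image's least-significant bits: invisible to a human viewer, but trivial for a training pipeline to check before ingesting the file.

```python
import numpy as np
from PIL import Image

# Hypothetical marker string -- its presence would mean "free to train on"
# (flip the polarity for the opt-out version from the first paragraph).
MARKER = "AI-TRAINING-OK"

def embed_marker(in_path: str, out_path: str, marker: str = MARKER) -> None:
    """Hide the marker in the least-significant bits of the first pixels."""
    img = np.array(Image.open(in_path).convert("RGB"), dtype=np.uint8)
    bits = np.array(
        [int(b) for byte in marker.encode("ascii") for b in f"{byte:08b}"],
        dtype=np.uint8,
    )
    flat = img.flatten()
    if bits.size > flat.size:
        raise ValueError("image too small to hold the marker")
    flat[: bits.size] = (flat[: bits.size] & 0xFE) | bits  # overwrite LSBs only
    Image.fromarray(flat.reshape(img.shape)).save(out_path, format="PNG")  # lossless

def read_marker(path: str, length: int = len(MARKER)) -> str:
    """Recover the marker a crawler would check before using the image."""
    flat = np.array(Image.open(path).convert("RGB"), dtype=np.uint8).flatten()
    bits = flat[: length * 8] & 1
    data = bytes(
        int("".join(str(b) for b in bits[i : i + 8]), 2)
        for i in range(0, bits.size, 8)
    )
    return data.decode("ascii", errors="replace")

# e.g. a scraper could do:  if read_marker("art.png") != MARKER: skip the file
```

Obviously this naive version doesn't survive re-encoding or resizing, which is part of why I think the AI companies themselves are best positioned to standardize something more robust.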
edit: added mention of music and other creative works to the second paragraph's first sentence
edit 2: Added the final paragraph, as I do think this should be opt-in, but I don't believe the AI companies would ever accept that, even though in my opinion they should by all means.
Now for the second type, representing models such as Stable Diffusion and ChatGPT: they would be required to have their trained model freely available to anyone, and any resulting output would not be copyrightable. It may be a fairer way of allowing anyone to harness the power of AI models that essentially contain the knowledge of all mankind, without giving any one party an unfair monopoly on it.
This should be easily enforceable for big corporations, since it would be too obvious if they tried to pass one type of model off as the other, or to keep the truth about their model from leaking. It might not be as easy to keep small groups or individuals from breaking those rules, but hey, at least it levels the playing field.
Of course the workaround would be to have multiple accounts, but that in turn can be made unscalable with a "prove you're human" box.
- This is still vulnerable to things like Mechanical Turk, or even normal users who got past the anti-bot measures, pulling the content and re-uploading it somewhere that's easier for the AI companies to use
- The artists' main contention is that the AI companies shouldn't be allowed to just use whatever they find without confirming they have a license to use the content in this way
- If someone's content _does_ get into an AI model and that is somehow determined (I think there is a case between a newspaper and ChatGPT over this very issue?), the legal system doesn't really have a good framework for the situation right now -- is it copyright infringement? (arguably not? it's not clear) Is it plagiarism? (arguably yes, but plagiarism is very hard to prove or get action on in the US court system) Is it a license violation? (for those who use licenses for their art, probably yes, but it's the same issue as plagiarism -- how do you prove it effectively?)
Really, what this comes down to is that the AI companies start from the premise that they have a right to use someone else's work for AI training without consent. While your suggestions are technically correct, they put the onus on the artists to do something different, because the AI companies are assumed to be allowed to train their models as they currently do, with no recourse for the original artist. Maybe that will be ruled true in the future, I don't know, but I can absolutely see why artists are upset about this premise shaping the discussion on AI training, since it negates their rights as artists and many of them have no path to recourse. I'm pretty sure OpenAI wouldn't think about scraping a Disney movie from a video upload site just because it's openly accessible, since Disney can fight back in a more meaningful way. I agree with the artists who complain that they shouldn't have to wait for a big corporation to decide this behavior is undesirable before real action is taken, but it seems that's what it's going to take. It might be reality, but it's a very sad reality that people want changed.