That’s likely to be the middle ground going forward for the smarter creative companies, and I’m personally all for it. Sure, use it for a pitch, or a demo, or a test - but once there’s money on the line (copyright in particular), get that shit outta there because we can’t own something we stole from someone else.
And yes, I know they do legal, agreed partnerships, like with the Predator franchise or the Beavis and Butt-Head franchise (yes, they exist in CoD now...), but those account for only a tiny fraction of the premium skins.
It goes beyond just IP law compliance. Creativity is their core competency and competitive differentiator. If you replace that with AI slop, then your product becomes almost indistinguishable from that of everyone else producing AI slop.
IMO, they're striking exactly the right balance: use AI as a creative aid and productivity booster, not as something that produces the critical aspects of the final product.
How does anyone prove it though? You can say "does that matter?" but once everybody starts doing it, it becomes a different story.
The scenario looks like this:
* Be Netflix. Own some movie or series where the main elements (plot, characters, setting) were GenAI-created.
* See someone else using your plot/characters/setting in their own for-profit works.
* Try suing that someone else for copyright infringement.
* Get laughed out of court because the US Copyright Office has already said that GenAI output is not copyrightable. [1]
[1] https://www.copyright.gov/ai/Copyright-and-Artificial-Intell...
- https://arstechnica.com/tech-policy/2025/02/meta-torrented-o...
- https://news.bloomberglaw.com/ip-law/openai-risks-billions-a...
Other than that, a bit of common sense tells you all you need to know about where the data comes from (datasets never released, LLM outputs suspiciously close to original copyrighted content, AI founders openly saying that paying for copyrighted content is too costly, etc.).
Later on they do have a note suggesting that the following might be OK if you use judgement and get their approval: "Using GenAI to generate background elements (e.g., signage, posters) that appear on camera"
They do want to save money by cheaply generating content, but it's only cheap if no expensive lawsuits result. Hence the need for clear boundaries and legal review of uses that may be risky from a copyright perspective.
But what word should we coin as the buzzword for “Netflix-Muzak”?
And when we're saturated with it all, we'll start buying DVDs (or other future media) again.
* fully-generated content is public domain, and copyright cannot be applied to it.
Make sure any AI content gets substantially changed by humans, so that the result can be copyrighted.
More importantly: don't brag; shut up about which parts are fully AI generated.
Otherwise: public domain.
Some people keep saying this but it seems obviously wrong to me.
At least in the United States, “sweat of the brow” has zero bearing on whether a work is subject to copyright[1]. You can spend years carefully compiling an accurate collection of addresses and phone numbers, but anyone else can republish that information, because facts are not a creative work.
But the output of an AI system is clearly not factual! By extension, it doesn’t matter how little work you put in—if the work is creative in nature, it is still subject to copyright.
1: https://en.wikipedia.org/wiki/Sweat_of_the_brow#United_State...
(IANAL, yadda yadda.)
Simpler yet - and inevitable, on sufficiently long time scales - is to dispense entirely with the notion of intellectual property and treat _all_ content this way.
They do not want to be disrupted.
Just look at people in their early 20s. They don't watch shows or movies; they only watch short-form videos. And short-form videos will mostly be created using GenAI tools as early as 2026.
Each time I scroll LinkedIn and I see some obviously AI produced images, with garbled text, etc. it immediately turns me off to whatever the content was associated with the image.
I'd be very disappointed to see the arts, including film making, shift away from the core of human expression.
“You know what the biggest problem with pushing all-things-AI is? Wrong direction. I want AI to do my laundry and dishes so that I can do art and writing, not for AI to do my art and writing so that I can do my laundry and dishes.” - Joanna Maciejewska
It's that not everyone has the talent to produce something of quality.
If you give a passionate professional chef the same ingredients for a full meal as your average home cook, the results will NOT be the same, not by a long stretch.
Much of "AI slop" is to content what McDonald's is to food: it's technically edible, but not high quality.
Do we want a society where everyone can masquerade as an “artist”, flooding society with low-quality content using AI trained on the work product of actual artists?
The people doing so do not have the talent they desire, nor did they do anything to upskill themselves. It's a shortcut to an illusion of competency.
Is that just because we are at the very beginning stages of the technology, though? It is just going to keep getting better, will the bias against AI generated content remain? I know people like to talk as if AI will always have the quality issues it has now, but I wouldn't count on that.
Like, I gather that prompt adherence has improved somewhat, but the actual output still looks _very_ off.
How would one ever know that the GenAI output is not influenced by or based on copyrighted content?
If you take a model trained on Getty and ask it for Indiana Jones or Harry Potter, what does it give you? These things are popular enough that it's likely to be present in any large set of training data, either erroneously or because some specific works incorporated them in a way that was licensed or fair use for those particular works even if it isn't in general.
And then when it conjures something like that by description rather than by name, how are you any better off than with something trained on random social media? It's not like you get to make unlicensed AI Indiana Jones derivatives just because Getty has a photo of Harrison Ford.
“Creative output” takes on an entirely different meaning once you start thinking about these models in terms of how they actually work.
The Gooner Association?
This is for studios and companies that are producing content for Netflix.
If you want to sell to Netflix, you have to play by Netflix's rules.
Netflix has all kinds of rules and guidelines, including which camera bodies and lenses are allowed [1].
[1] https://partnerhelp.netflixstudios.com/hc/en-us/articles/360...
... Of course it is. As the distributor, Netflix obviously has a fairly broad ability to control what it distributes.
This is 100% a lie.
Studios will use this to replace humans. In fact, the idea is for the technology – AI in general – to be so good you don't need humans anywhere in the pipeline. Like, the best thing a human could produce would only be as good as the average output of their model, except the model would be far cheaper and faster.
And... that's okay, honestly. I mean, it's a capitalism problem. I believe with all my strength that this automation is fundamentally different from the ones from back in the day. There won't be new jobs.
But the solution was never to ban technology.
Any studio that isn't playing ostrich has realized this (so possibly none of them) and should just be trying to extract as much value as possible, as quickly as possible, before everything goes belly up.
Of course timelines are still unclear. It could be 5 years or 20, but it is coming.
In this case, for instance, Netflix still has relationships with its partners that it doesn't want to damage at this moment, and we are not at the point of AI being able to generate a whole feature-length film indistinguishable from a traditional one. They might also be apprehensive about legal risks and copyrightability right now; big companies' lawyers are usually pretty conservative about taking any "risks," so they probably want to wait for the dust to settle as far as legal precedents and the like.
Anyway, the issue here is:
"Does that statement actually reflect what Netflix truly think and that they actually believe GenAI shouldn't be used to replace or generate new talent performances?"
Because they believe in the sanctity of human authorship or whatever? And the answer is: no, no, hell no, absolutely no. That is a lie.
>> This is 100% a lie.
We’ve had CGI for decades and generally don’t mind. However, at the point where AI usage becomes a negative (e.g., the content appears low quality) because of its usage, I’d expect some backlash and pulling back in the industry.
In film and TV, customers have so much choice. If a film or show is low effort, it’s likely going to get low ratings.
Every business and industry is obviously incentivized to cut costs, but if those cuts directly affect the reputation and image of your final product, you probably want to choose wisely which things you cut.
However, this statement is a hell of a lot better than I expected to see, and suggests to me that the actors' strike a few years ago was necessary and successful. It may, as you say, only be holding back the "capitalism problem" dike, but... At least it's doing that?
When AI gets good enough, 2, 3, 5, 10 years from now, they simply reverse path, and this statement wouldn't delay Netflix embracing AI films that much, if anything.
> I would somewhat disagree with this statement being a sign the strike was a success because, like, AI is not at the point of generating a whole movie in human quality today, so Netflix issuing this statement like this now, in November 2025, costs them literally nothing, and feels more like a consolation prize: "Here, take this statement, so you guys can pretend the strike achieved anything."
>
> When AI gets good enough, 2, 3, 5, 10 years from now, they simply reverse path, and this statement wouldn't have delayed Netflix embracing AI films that much, if anything.
For every person who gets to make creative decisions, there are hundreds upon hundreds of people whose sole purpose is slavish adherence to those decisions. Miyazaki gets to design his beautiful characters, but the task of getting those characters to screen must be carried out by a massive team of illustrators for whom "creative liberty" is a liability to their careers.
(And this example is only for the creative aspects of film-making. There is a lot of normal corporate and logistical stuff that never even affects what you see)
That's not to say I'm looking forward to the wave of lazy AI-infused slop that is heading our way. But I also don't necessarily agree with the grandstanding that AI is inherently anti-creative or only destructive. I reserve the right to be open-minded.
The irony is that movies and TV themselves represented a cheaper, industrialized and commoditized alternative to theater. And theater is still around and just as good as it ever was.
This is vastly oversimplifying and is misleading. Key animators have a highly creative role. The small decisions in the movements, the timings, the shapes, even scene layouts (Miyazaki didn't draw every layout in The Boy and the Heron), are creative decisions that Miyazaki handpicked his staff on the basis of. Miyazaki conceived of the opening scene [0] in that film with Shinya Ohira as the animator in mind [1]. Even in his early films, when he was known to exert more control, animator Yoshinori Kanada's signature style is evident in the movements and effects [2].
[0]: https://www.sakugabooru.com/post/show/260429
[1]: https://fullfrontal.moe/takeshi-honda-the-boy-and-the-heron-...
[2]: Search for "Kanada animated many sequences of the movie, but let’s just focus on the most famous one, the air battle scene." in https://animetudes.com/2021/05/15/directing-kanada/
Yes, but at least those decisions come from one person or a few people, not just an algorithm.
Some skills, like framing, values, balance, etc. become even more important differentiators. Yes, it is much different. But as long as humans are in the loop, there is an opportunity for human communication.
I'm curious if the parent poster thinks this is unique to film production, because I think you can make the same argument for pretty much any trade. Software engineering is 1% brilliance and 99% grunt work. That doesn't mean software engineers are going to enjoy a world where 99% of their job goes away.
Further, I'm not sure the customers will, because the fact that human labor is comparatively expensive puts some checks and balances in place. If content generation is free, the incentive is to produce higher-volume but lower-quality output, and it's a race to the bottom. In the same way, when content-farming and rage-baiting became a way to make money, all the mainstream "news" publishers converged on that.
https://www.equity.org.uk/advice-and-support/know-your-right...
Common-sense, practical, and covers a lot of the shifting ground around an artist’s ability to withdraw consent under GDPR and the ways they can properly use this to prevent their likenesses being used to train their digital replacements.
(Equity is the UK equivalent of the AEA and SAG-AFTRA combined)
I wonder if we're going to see a push back by media companies around copyright over AI-generated content. Though I don't see how; copyright is explicitly an artificial legal protection of human works.
The irony is rich: they built their empire on disrupting old Hollywood gatekeeping, and now they’re recreating it in AI form. Instead of letting creators experiment freely with these tools, Netflix wants control over every brushstroke of AI creativity.
I do agree Netflix wants to crush creators.