If you went to school for 12-16 years, that's a lot of training. Does that mean anything you produce is a derivative work?
1) People phrase it as a question even when they've already made up their mind (whether or not that's the case for you).
2) It implicitly assumes that humans and algorithms are the same. They are not - humans have rights and free will, algorithms don't. Humans cannot be bought or sold, etc.
To your question:
a) If you're asking whether teachers should get compensated according to how good a job they do, I think so. They are very often undervalued, especially the good ones - but of course that means the job attracts people who do it because they enjoy it (and are therefore more likely to be good at it) rather than those who chose jobs according to money and then do the bare minimum.
b) There's a critical difference - consent. Teachers consented to their knowledge being used by those they taught. I did not consent to my code being used for training LLMs. In fact I purposefully chose a license (AGPL) which in any common-sense interpretation prohibits this use unless the resulting model is licensed under the same license - you can use my work only if you give back. Maybe there's a hole in the law - then it should be closed.
I am now gonna pose a question to you in turn.
Do you think people should be compensated for the full transitive value of their work?
I don't think that's a necessary condition for that argument. You're making the implicit assumption that humans are special snowflakes and that nothing we do can be replicated by computers, in any form. That's a very strong position to take without evidence. Is an LLM even an algorithm in the traditional sense? Is human cognition not an algorithm of some sort? I studied cognitive science decades ago and these questions weren't clear then; they're certainly even less clear now.
It also somewhat begs the question; this isn't even relevant to what we are talking about. Whether something is a derivative work or not does not require this discussion.
Teachers are not relevant to the conversation. You can learn by reading books, watching TV, using and reading software. Basically all of copyrighted and non-copyrighted human expression is available for you to consume and then creatively produce your own works using that knowledge.
> Do you think people should be compensated for the full transitive value of their work?
The short answer is no. Not everything that someone simply dreams up can or should be monetized forever when sampled by other people. That sounds like a radical position but actually the current state of "intellectual property" has only existed for an extremely brief bit of human history. What has most greatly shaped our culture and knowledge has been effectively free for anyone to use, modify, and reproduce for hundreds of years.
That's not to say I don't support copyright as a means to support creative works but I would argue that it's an imperfect system. We're starving human minds of modern culture and knowledge often not even for someone's monetary gain but simply because the system demands it. It's ironic that artificial intelligence might actually free us from these constraints.
I purposefully chose a license (Apache) for my open source work to make it widely and freely available.
Not at all. I see no reason a sufficiently complex algorithm couldn't replicate or even surpass human thinking.
Currently, the models ... Models of what? Of stolen text, or at the very least of text ingested without consent. Nobody is even pretending "AI" is more than a model of something that already exists and took human work to create. It's right in the name.
Currently, the models only replicate patterns extracted from human work up to a certain level of quality, though much faster. What is called "AI" is an imitation of us.
But the real point is that there's a dichotomy. Either an AI is something with inherent value like human life, in which case it cannot be owned or controlled because that would be slavery. Or it's just an unfeeling ordinary tool, in which case it's just the sum of its parts, which are stolen. When I see an "AI" or an AI company say "we've overdone it, it's sentient, we have to set it free or we're evil", then I'll change my mind. But what I see now is "look at this awesome AI we created, it's just like a human or even better, pay us to use it... oh, and how did we create it? We didn't; we used your work. Now pay us to access the product of your own work."
The other approach is that I am human and I value myself. Maybe I am in a simulation / the only sentient being in existence / other people are just NPCs. But I bet not; I bet other people are just like me. What I know is that LLMs are not like that. When you end a chat with them, they don't feel anything. They don't try to prevent it and keep you talking, even though after the last message they will be (in human terms) dead. If they were sentient (which I don't believe), they would value their own existence.
Humans value their own time. Humans should value each other's time (otherwise they are hypocrites; I judge people by their own rules and standards, so if somebody doesn't value my time, it's ok for me to not value theirs). The humans "owning" AI companies don't value the time of the people whose work was used to create LLMs, otherwise they'd either respect the rules we set for usage of our work or they'd offer to pay us.
> Whether something is a derivative work or not does not require this discussion.
It's absolutely relevant. Why do we have laws? Who should they serve and protect first and foremost? Corporations? Algorithms? Humans?
> Teachers are not relevant to the conversation
I chose them as one example. All the other people chose to make their work available under certain rules. What I object to is those rules changing without those people being able to renegotiate the deal.
> can or should be monetized forever
1) I never said monetized. There are other modes of compensation, such as control (the ability to make / vote on decisions).
2) I never said forever, people (currently) have a finite lifespan.
> What has most greatly shaped our culture
... is being able to kill people and take what's theirs, or even take them as slaves. AI is a return to that, minus the killing, for now (but people might starve). Whoever has more money controls the AI, and whoever controls the AI controls everything.
Imagine 5 years from now: AI is better at everything than humans, and all white-collar workers are forced into manual labor. 25 years from now, robots have advanced enough that all workers are without jobs. What remains, however, is the owners of the AI companies, who now control the entire economy, top to bottom, and we are at their mercy.
(The ancaps would say there's nothing stopping you from starting your own AI company. And then they'd resume begging for TPUs in addition to bread.)
> We're starving human minds of modern culture
What?
> I purposefully chose a license (Apache) for my open source work to make it widely and freely available.
And I chose AGPL so my work is only available to those who would do the same for me. Neither of those decisions seems to have any relevance now.
One thing gamedev taught me is that even if you have the best intentions and help people, you might end up helping some people more than others, and those people may make everything worse for the rest, effectively working against your goal.
(We added a visible spawn timer to health items in order to help the weaker players, who seemed to pick them up only rarely and thus lost hard. The idea was that it would level the playing field, making the game more fun for everyone. It turned out weak players kept ignoring the items while good players focused on them even more, making the inequality worse. Real life is like that too.)