Is this the same AI that I see most days get instructions to "not do anything destructive without explicit permission" and then go on and delete production systems?
There is reasonable doubt that it actually limits itself based on such requests.
A recent SCOTUS refusal to hear an appeal could mean that clean-room implementations may not be licensable at all.
If AI-produced content cannot be copyrighted, can it be licensed?
Long answer: well, it depends on how much human creative input was involved.
But what are the social ramifications if this kind of thing is deemed acceptable? It feels like it would effectively be the end of OSS licensing, because it's pretty straightforward to do this for any project.
Any company that wanted a proprietary copy of a program could, in theory, follow this same technique with relative ease. That feels wrong.
So maybe we need to re-think the "copyrightable API" and "clean room" legal concepts. How? I don't know. But a world in which OSS licenses are easily sidestepped feels like the wrong direction.
From a moral perspective, Dan's desire to relicense the project carries much more weight given his many years of contribution.
From a practical perspective, the fact that the rewrite went so well (significant performance boost, virtually no duplicate code) speaks to his skill and experience with the domain.
It's also a cause of friction here, because the fact that he knows the codebase so well makes him less credible as a clean-room implementer: his own biological neural weights are deeply biased by what he's learned from that existing code.