Actually, replication is very important. If no one can make new llamas, that would suggest Facebook used some secret sauce in their training. Publicly understanding how to train these 'enhanced' models that match the performance of much larger models is a very strong motive.
And getting rid of the NC (non-commercial) clause of the original llamas too, of course.
As of right now, people are having trouble replicating the paper's eval results, for example.