And then what? Do you think humans will just cede that part of existence to the AI, in a fit of self-fulfilling prophecy? Cutting off the source of novel training data, thereby stagnating humanity's progress at "we once made an AI that's 1% better than the best human on a bad day"?
This is the central conceit of AI maximalists, and tech maximalists as a whole imo: someone wants something that fits in a box, which everyone else must eventually delegate to. Given capitalism's focus on creating owned systems, and its tendency to centralize and maintain asymmetry of access in order to extract rent, this is a deeply concerning direction to keep pushing development in.
We should be building better people; instead, we've abandoned that in favor of building a better mantrap.