Also, I don’t think you fully appreciate the lengths to which humans in power will scheme to preserve that power long past the point of its utility. If future AI needs organic human data to improve, then we will be turned into data-generation machines, with money granted based on the quality, uniqueness, or importance of said data - which is kinda what Capitalism is already doing, if you squint a bit. Those systems, once entrenched, will survive long past the point of necessity, provided the populace as a whole doesn’t become aware of that fact. After all, just look at the growing political extremism as more folks realize not only that the current social contract is irreparably broken (all work, no homes, no stability or security with which to take chances for most folks), but that the current political mechanisms and institutions built to serve it are similarly unnecessary. It’s partly why, I suspect, Capital is latching so hard onto the idea that AI is their exit strategy: it means their assets will continue appreciating in value along with their net worth even as the rest of the planet crumbles and burns around them - ensuring their safety, or so they think.
My point is: the future is unknowable, and you shouldn’t underestimate the human desire to humiliate and enslave others by any means necessary.