I've heard that some former colonies have done quite well after the colonizers left, especially where colonialism was combined with Christian missionary activity.
Going further back, while the Spaniards did many horrible things to the natives, they also put a stop to even greater horrors, such as the mass human sacrifice practiced at the time.
Even further back, Roman colonization was exploitative, but it also brought peace to the regions Rome colonized.
So colonization can, at times, bring about a better social structure. Given that people are often driven by selfish gain, letting them get something out of enacting a just social structure seems to be the best way forward.