My fear of AGI is that, as you're saying, it couldn't be stopped once deployed. It's irreversible. It would know we want to shut it down, and it could copy itself onto other devices (think about the hacking capabilities of an AGI for a moment) or find some other method of survival.
The current approach is to regulate it after it is invented. While that worked for cars, planes, and many other inventions, AGI is different for exactly the reason above: by the time regulators react, it may already be beyond recall.
In fact, an even scarier thought: any group of researchers cognizant of this, if motivated by profit, would have every incentive to hide their creation of AGI if they succeeded. It would thus remain woefully unregulated.