not at all. AGI is so far away that maybe my kids' kids' kids will see it... LLMs, no matter how they're marketed, are nowhere near it. and integrating an LLM into other tooling to 'control' things is also not AGI.
that being said, it's plausible some idiot will connect an LLM to something potentially destructive and cause a disaster. i don't think it'd be some global Skynet-style meltdown though. maybe some bridge that opens when it should close, or a dam that breaks, or whatever.
still, if there ever were such a threat, my wife's from a nomadic tribe in the Sahara, so that's my plan: go into the sand, where there's nothing for such a system to exert control over and nothing for it to gain.
it's also unlikely an AGI would see such a non-technological tribe or people as a threat to any of its objectives or goals, so it probably wouldn't spend resources on them.