Does this sound similar enough to what you were doing? Was there something difficult in this that you could explain?
Aside from being completely hand-wavy in my hypothetical, guesstimated implementation, I figured the most difficult part would be piping complex actions together. "Remind me tomorrow about any events I have on my calendar" would be a conditional action based on lookups, etc., so the order of operations would also have to be parsed somehow. I suspect a looping "thinking" mechanism would be necessary, and while I know that's not a novel idea, I'm unsure whether I'd nonetheless have to reinvent it in my own stack for the way I want to deploy it.
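That looping "thinking" mechanism might look something like the sketch below: the model repeatedly picks either an action to run or a final answer, and each action's result is fed back in before the next decision. Everything here is hypothetical (the `model` callable, the `("call", ...)` / `("answer", ...)` step format, and the `list_events` tool are all made-up names for illustration), not how either of us actually built it.

```python
def agent_loop(model, tools, goal, max_steps=5):
    """Minimal plan-act loop (hypothetical sketch).

    `model` is any callable that looks at the history and returns either
    ("call", tool_name, kwargs) to take an action, or ("answer", text)
    to finish. Each tool result is appended to the history so the next
    "thinking" step can condition on it -- that's what lets a request
    like "remind me about tomorrow's events" chain a lookup into a
    follow-up action.
    """
    history = [("goal", goal)]
    for _ in range(max_steps):
        step = model(history)
        if step[0] == "answer":
            return step[1]
        _, name, kwargs = step
        result = tools[name](**kwargs)       # run the chosen action
        history.append(("observation", result))  # feed result back in
    return None  # gave up: too many steps without a final answer


# Usage with a scripted stand-in for the model:
_steps = iter([
    ("call", "list_events", {"date": "tomorrow"}),
    ("answer", "Tomorrow you have: standup"),
])
scripted_model = lambda history: next(_steps)
tools = {"list_events": lambda date: ["standup"]}
print(agent_loop(scripted_model, tools, "remind me about tomorrow"))
```

The `max_steps` cap matters in practice: without it, a confused model can loop on lookups forever.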
I have a working solution to exposing the toggles.
I’m integrating it into the bot I have in the other repo.
The goal is that you point it at an OpenAPI spec, and then GPT can choose and run functions from it. Basically Siri, but with access to any API.
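One way that pipeline could start is by mechanically translating each OpenAPI operation into a function-calling tool schema the model can pick from. This is a rough sketch under my own assumptions, not the actual repo's code; the helper name and the simplified spec shape (inline `parameters` with JSON-schema `schema` fields, no `$ref` resolution, no request bodies) are all assumptions for illustration.

```python
def openapi_to_tools(spec):
    """Convert a simplified OpenAPI spec dict into GPT function-calling
    tool definitions (hypothetical sketch -- assumes every operation has
    an operationId and inline parameter schemas, and ignores $refs and
    request bodies).
    """
    tools = []
    for path, methods in spec.get("paths", {}).items():
        for method, op in methods.items():
            props, required = {}, []
            for p in op.get("parameters", []):
                props[p["name"]] = p.get("schema", {"type": "string"})
                if p.get("required"):
                    required.append(p["name"])
            tools.append({
                "type": "function",
                "function": {
                    "name": op["operationId"],
                    "description": op.get("summary", f"{method.upper()} {path}"),
                    "parameters": {
                        "type": "object",
                        "properties": props,
                        "required": required,
                    },
                },
            })
    return tools


# Usage with a toy calendar spec:
spec = {"paths": {"/events": {"get": {
    "operationId": "listEvents",
    "summary": "List calendar events",
    "parameters": [{"name": "date", "required": True,
                    "schema": {"type": "string"}}],
}}}}
print(openapi_to_tools(spec)[0]["function"]["name"])
```

When the model then emits a tool call like `listEvents(date="tomorrow")`, a dispatcher maps the `operationId` back to its path and method and fires the actual HTTP request.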
I'm guessing the real solution looks like a model trained to take actions on the internet. That kinda sucks for those of us on the outside, because whatever we build will be the same brittle, chewing-gum-and-duct-tape approach as usual. Best to wait for the bleeding edge, like what that MinecraftGPT project was aiming at.