now you can run an openai-compatible proxy for all of your "local model" needs.
currently using it as my custom provider in repoprompt
the pkg includes a client, a cli, and the proxy
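since the proxy speaks the OpenAI chat-completions API, any client can target it just by swapping the base URL. a minimal stdlib sketch of what that looks like — the `localhost:8080` address, model id, and api key here are placeholders, not the package's actual defaults:

```python
import json
import urllib.request

# hypothetical local proxy address -- substitute wherever your proxy listens
BASE_URL = "http://localhost:8080/v1"

# the standard OpenAI chat-completions payload; any OpenAI-style client
# builds this same shape, which is what makes the proxy a drop-in provider
payload = {
    "model": "local-model",  # placeholder model id, proxy-dependent
    "messages": [{"role": "user", "content": "hello"}],
}

req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer sk-local",  # many local proxies ignore the key
    },
    method="POST",
)

# urllib.request.urlopen(req) would actually send it; left unsent here
# since no proxy is running in this sketch
print(req.full_url)  # -> http://localhost:8080/v1/chat/completions
```

in repoprompt (or any other tool with a custom-provider setting), pointing the provider's base URL at the proxy is all that's needed.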