Hi OpenAI Team,
I have been using the Plugins API (technically, function calling) all afternoon, and I have to tip my hat to the OpenAI team: you did an amazing job. The price reduction on top of it was a great bonus.
I don’t have any negative feedback, just a feature request: when using functions in the API, it is a three-step process (detection, call, complete). I think this is fine for now, but what if you could deploy your functions on Azure and supply GPT with your function endpoint in the .Chat object in the OpenAI SDK? It would automatically call the supplied endpoint, get the server’s response back, and return only the final completion, all in one code snippet/request. I think this would be absolutely amazing and make the API experience that much better.
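For context, here is a rough sketch of the three steps as they stand today. The model’s `function_call` response is simulated here rather than fetched from the API, and `get_current_weather` is a made-up example function, but the detection/call/complete plumbing is the part the feature request would collapse into one round trip:

```python
import json

# Hypothetical local function that the model can "call".
def get_current_weather(location, unit="fahrenheit"):
    # Stub implementation for illustration only.
    return {"location": location, "temperature": 72, "unit": unit}

# Step 1 (detection): the chat completion request includes a `functions`
# list, and the model replies with a `function_call` instead of a normal
# message. A simulated response payload stands in for that reply here.
function_call = {
    "name": "get_current_weather",
    "arguments": '{"location": "Boston, MA"}',
}

# Step 2 (call): the developer parses the JSON arguments and invokes the
# matching local function themselves.
args = json.loads(function_call["arguments"])
result = get_current_weather(**args)

# Step 3 (complete): the result is sent back in a second request as a
# role="function" message so the model can produce the final answer.
followup_message = {
    "role": "function",
    "name": function_call["name"],
    "content": json.dumps(result),
}
print(followup_message["content"])
```

With a hosted-endpoint option, steps 2 and 3 would happen server-side and only the final completion would come back.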
I understand this is literally the first day of this being released, but I couldn’t help but write in about this because it would be so helpful. Given that OpenAI has a partnership with Microsoft, this seems like it would be a no-brainer.
Appreciate it as always! Great job on today’s release.