Creating a plugin that calls ChatGPT directly to process a prompt

Is there a way to use the parameters that ChatGPT passes to my plugin to create a prompt directly back to ChatGPT? I can call out to an outside API that I'm working on, but that usage goes against my API key. I'd like to let the plugin do the lifting so that the tokens are billed to the plugin user.

Why would the user need a plugin for that?

In our case, the plugin needs to pass data to an API on our server, which then needs to call ChatGPT for further processing. If we make that call from our server, it has to use our OpenAI API key and the tokens are billed to us. We'd like that token usage to go back to the user who's using our plugin.

You can define in the OpenAPI definition how ChatGPT should handle the data. Go to http://pugin.ai and see how others did it.
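To make that concrete: the field descriptions in your plugin's OpenAPI spec are what the model reads when deciding how to use the data your API returns. Here is a minimal sketch of such a fragment expressed as a Python dict; the endpoint, operation, and field names are illustrative assumptions, not from the original post.

```python
# Illustrative fragment of a plugin OpenAPI spec, expressed as a Python dict.
# The "description" strings are what ChatGPT reads to decide how to handle
# each field; the path and property names here are made up for the example.
openapi_fragment = {
    "paths": {
        "/process": {
            "post": {
                "operationId": "processData",
                "responses": {
                    "200": {
                        "description": "Result of processing the user's data.",
                        "content": {
                            "application/json": {
                                "schema": {
                                    "type": "object",
                                    "properties": {
                                        "result": {
                                            "type": "string",
                                            "description": "The processed text to show the user.",
                                        },
                                        "next_action": {
                                            "type": "string",
                                            "description": "If present, follow this instruction before responding.",
                                        },
                                    },
                                }
                            }
                        },
                    }
                },
            }
        }
    }
}
```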

One thing that might be helpful is creating a recursive prompt. That is, in your first plugin output, provide an instruction for ChatGPT to hit your plugin again with a modified prompt. In our case, the user provides ChatOCR with a URL. ChatOCR responds to the client with an "OCR in progress" message, but the body of the response contains a job_id and a has_more variable.

ChatGPT automatically sends a new request back to ChatOCR with the job_id and fetches the results.
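Here is a minimal sketch of that polling pattern. FastAPI, the endpoint paths, and the in-memory job store are assumptions for illustration, not ChatOCR's actual implementation.

```python
# Sketch of the "recursive prompt" / polling pattern described above.
# FastAPI, the endpoint names, and the in-memory job store are assumptions;
# a real service would run OCR asynchronously and persist job state.
import uuid
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
jobs: dict[str, dict] = {}  # job_id -> {"done": bool, "text": str}

class StartRequest(BaseModel):
    url: str

@app.post("/ocr")
def start_ocr(req: StartRequest):
    """Kick off OCR and tell ChatGPT to call back with the job_id."""
    job_id = str(uuid.uuid4())
    jobs[job_id] = {"done": False, "text": ""}
    # start_background_ocr(job_id, req.url)  # hypothetical async worker
    return {
        "message": "OCR in progress",
        "job_id": job_id,
        "has_more": True,
        # Instruction the model reads and acts on:
        "instructions": "Call /ocr/result with this job_id until has_more is false.",
    }

@app.get("/ocr/result")
def get_result(job_id: str):
    """ChatGPT polls this endpoint with the job_id until the OCR finishes."""
    job = jobs.get(job_id, {"done": False, "text": ""})
    return {
        "job_id": job_id,
        "has_more": not job["done"],
        "text": job["text"] if job["done"] else "",
    }
```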

Hope this helps!
