Is it possible to utilize a plugin with the ChatGPT API?

Is it possible to use a plugin inside the ChatGPT API?
If it’s not possible right now, is it something that’s planned for the future?

Not possible right now, might be in the future but no roadmap.

Is it possible to attach/restrict a plugin to an API key? That would make domain-specific plugins possible, and it would open up a huge range of possibilities and an alternative to autotuning.

Not sure I know exactly what you mean, but check out the docs on what I think might help answer this: OpenAI API

My apologies for the vague request. Let’s say we want to develop a chat interface for our website. The chat interface makes API requests to GPT-4 in the backend using one of our API keys.

Now let’s assume there were an option to enable certain plugins for that API key (the one we use to call the OpenAI API endpoints), so that GPT-4 would query our API endpoints when necessary.

This LangChain example works, but only for plugins that don’t require authorization. Is it possible to make this work for the other plugins, like the code interpreter?

https://python.langchain.com/en/latest/modules/agents/tools/examples/chatgpt_plugins.html

Yes, LangChain handles this and can use external plugins as well. As long as the MIT license fits your solution, this is probably the best bet for now.

This is very important for integration with real-world conversation. Although it can be partially simulated via LangChain, that approach is very limited. It would be better for the ChatGPT API to initiate queries itself, rather than having us feed data into the API in advance, since a conversation might need more data than originally assumed. Please consider adding this to the plan.

@logankilpatrick Hey.

Can you please provide an estimate for when plugin usage via the ChatGPT API will be available? Based on information gathered from OpenAI blogs and forums, we recognize that the ability to use plugins on the backend is a highly valuable feature. It grants the freedom to become less reliant on the ChatGPT UI and instead implement a user interface that aligns with the unique requirements of your organization. Thanks :pray:

I am fairly certain that if you ask the Web Requests plugin to query the HTTP endpoint of the API with your bearer token included, it will build the URI correctly, query the API for you, and return the response.

We actually made it work for simple plugins like JoPilot using the ChatGPT 3.5 API.

We basically ask the ChatGPT API to detect whether the external API needs to be called and what the request body should be. Then we call the API and pass the data using an HTTP POST request.
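A minimal sketch of that detection step, assuming a hypothetical wrapper around the ChatGPT API. The prompt wording and the JSON schema here are illustrative, not JoPilot’s actual implementation:

```python
import json

# Illustrative system prompt: ask the model to answer with machine-readable
# JSON saying whether our external API should be called, and with what body.
DETECT_PROMPT = (
    "If answering the user requires calling our jobs API, reply only with "
    'JSON like {"call_api": true, "body": {...}}; otherwise reply with '
    '{"call_api": false}.'
)

def parse_decision(model_reply: str):
    """Turn the model's reply into (should_call, request_body)."""
    try:
        decision = json.loads(model_reply)
    except json.JSONDecodeError:
        return False, None  # unparseable reply: assume no API call needed
    if decision.get("call_api"):
        return True, decision.get("body")
    return False, None
```

If `should_call` comes back true, the backend issues the HTTP POST with the parsed body and feeds the response back into the next chat turn.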

It should work much better with the GPT-4 API, but we can’t check yet because we’re still waiting for approval.

P.S.: we didn’t use LangChain for this.

Could you share how you use the ChatGPT API to determine whether it should call an external API?

You can set up the conditions under which you want ChatGPT to call your API in the “Custom Instructions” feature that just rolled out, and even put your tokens there. Or, you can just ask it to “Build and send an API payload for the Chat Completions OpenAI API”.
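For the “build a payload” approach, a request to the Chat Completions endpoint looks roughly like this. The model name and message contents below are placeholders, not anything from this thread:

```python
import json

# Hypothetical Chat Completions payload. You would POST the JSON body to
# https://api.openai.com/v1/chat/completions with an
# "Authorization: Bearer <your API key>" header.
payload = {
    "model": "gpt-3.5-turbo",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Say hello."},
    ],
    "temperature": 0.7,
}

body = json.dumps(payload)
```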

It actually isn’t exactly aware of what Chat Completions are, because its base model was trained before they existed. This is another area where Web Requests shines – you should really ask it: “First, search the web for some usage examples for the ChatGPT “chat completions” API endpoint to refresh your knowledge. Then, build a payload blah blah…”

It looks something like this (I asked it to be GPT-3.5-turbo-16k’s proxy; 3.5-turbo was eager to learn what it missed!).

(Don’t worry, the key has been revoked.)
