Using a ChatGPT Plugin to Discover Other Plugins

I have an idea that would be super easy to do, and I’ve set up a simple prototype on localhost: basically, discovering plugins using ChatGPT itself. I posted a couple of screenshots showing how this would work. Would anyone be interested in this?

I already have the zapvine.com catalog set up for the info. Any devs wanting their plugins to be discoverable can just add their plugin manifest link to zapvine.


1 Like

Open the dev console in the browser.
Go to the Network tab.
Select GPT-4, then go to the plugin store.
A request named “p?offset=0&limit=250&statuses=approved” appears in the Network tab.
Inspect it in the preview.
Copy the object from the response, and voilà…
You have all the plugins with their respective ai-plugin.json and openapi.yaml links.
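If it helps, here’s a rough sketch of pulling the links out of that copied response. The field names (`items`, `domain`, `manifest`, `api.url`) are my guess at the payload shape based on what shows up in the network tab, so adjust to what you actually see:

```python
import json

# A cut-down, hypothetical extract of the store response copied from the
# network tab. The field names are assumptions, not a confirmed schema.
sample_response = json.loads("""
{
  "items": [
    {
      "domain": "gpt-chess.atomic14.com",
      "manifest": {
        "name_for_model": "chess",
        "api": {"type": "openapi", "url": "https://gpt-chess.atomic14.com/openapi.yaml"}
      }
    }
  ]
}
""")

def plugin_links(store_response):
    """Yield (name, ai-plugin.json URL, openapi URL) for each listed plugin."""
    for item in store_response.get("items", []):
        manifest = item.get("manifest", {})
        yield (
            manifest.get("name_for_model"),
            f"https://{item['domain']}/.well-known/ai-plugin.json",
            manifest.get("api", {}).get("url"),
        )

for name, manifest_url, oas_url in plugin_links(sample_response):
    print(name, manifest_url, oas_url)
```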

I don’t know if this helps.

Is your locally hosted plugin now able to select the appropriate plugins from the store?
Can this be used to switch plugins on and off during conversation?

6 Likes

Thanks for the info on getting the existing plugin info. I’m going to check that out.

What I put together is for devs to be able to upload their manifest to zapvine so that it’s discoverable through a ChatGPT plugin. The plugin would not be integrated into any official sources, nor would it be able to switch plugins on and off.

1 Like

But the funny thing is that, with a locally hosted plugin and the endpoints of all plugins available, the three-plugin limitation becomes obsolete. You can request any endpoint locally and relay its response through your own locally hosted plugin.
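Roughly what I mean, as a sketch. The registry entry and the `fetch` stub are illustrative only; a real proxy would perform the actual HTTP call:

```python
# Minimal sketch of the local-proxy idea: one locally hosted plugin that
# forwards a request to any registered upstream plugin endpoint, so the
# three-plugin cap never applies.

REGISTRY = {
    # name_for_model -> base URL (illustrative entry, not real routing data)
    "chess": "https://gpt-chess.atomic14.com",
}

def fetch(url, params):
    # Stand-in for an HTTP GET; returns a canned payload for demonstration.
    return {"url": url, "params": params}

def proxy(plugin_name, path, params=None):
    """Forward a call to the named plugin's endpoint and relay the response."""
    base = REGISTRY[plugin_name]
    return fetch(f"{base}{path}", params or {})

print(proxy("chess", "/move", {"san": "e4"}))
```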

Really? The chess plugin has legal info on example.com? That’s a joke right?

Not sure if the dev meant it as a joke, but that’s what’s in the original manifest. It is funny that he slipped that past the approvers. https://gpt-chess.atomic14.com/.well-known/ai-plugin.json

That is an interesting idea: it would be a local proxy. It’s worth investigating. However, if the backend services required authentication, it would be a problem to support those plugins.

Is there a limit on the number of endpoints for plugins?

1 Like

Excellent question. I’m going to create a test and find out what that limit is.

1 Like

Yeah… let’s find out, and if there is none, let’s build one that can do everything.

3 Likes

What may go wrong with so many endpoints is that the LLM will not choose the proper endpoint.
I guess explicit prompting could help steer the LLM to the wanted endpoint.

Or …

The Universal Local Plugin (ULP) does not need all endpoints included; this can be solved through prompting.
For example, the ULP needs only two or at most three request parameters: {plugin_name}, {endpoint_command}, {kwargs}.
On the local side you then execute the endpoint that those parameters point to.
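As a sketch of that dispatch (all names here are illustrative, not a real registry):

```python
# Sketch of the ULP dispatch described above: the model only ever supplies
# three parameters -- plugin_name, endpoint_command, kwargs -- and the local
# side resolves them to a concrete endpoint.

ENDPOINTS = {
    # (plugin_name, endpoint_command) -> callable (illustrative entry)
    ("weather", "forecast"): lambda city: f"forecast for {city}",
}

def ulp(plugin_name, endpoint_command, kwargs=None):
    """Resolve the command to a registered endpoint and execute it."""
    handler = ENDPOINTS.get((plugin_name, endpoint_command))
    if handler is None:
        return f"unknown endpoint: {plugin_name}/{endpoint_command}"
    return handler(**(kwargs or {}))

print(ulp("weather", "forecast", {"city": "Berlin"}))  # forecast for Berlin
```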

2 Likes

So you think the individual response parameter definitions are not used?

Do you know how long it took me to build that? I’ve wasted at least 2 hours of my life. :rofl:

Yes that works! Thanks for the awesome tip. It did pull the complete list.

You’re welcome, and thank you, all of you.

Would you share the ULP?

Or we could collaborate on a repo on Git. I have many more ideas:

The user can ask the ULP for help or an explanation.
The user can ask for a list of all plugins.
The user can ask for the endpoints and usage of a specified plugin.
etc.


In the meantime, I can request the entire list of all plugins.

What I need now is a way to request the list of plugins directly via: https://chat.openai.com/backend-api/aip/p?offset=0&limit=250&statuses=approved

But it seems that I need to make a cross-origin request with a predefined header.
I’ll work on this.
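For reference, roughly what I mean. The Authorization header is an assumption (in practice you’d copy the real session token from the browser’s network tab), and the request here is only built, not sent:

```python
from urllib.parse import urlencode
from urllib.request import Request

# Build the store-listing request with the query string and an assumed
# auth header. Sending it (urlopen) will likely still hit bot protection.
query = urlencode({"offset": 0, "limit": 250, "statuses": "approved"})
req = Request(
    f"https://chat.openai.com/backend-api/aip/p?{query}",
    headers={"Authorization": "Bearer <session-token>"},  # placeholder token
)

print(req.full_url)  # full URL with the query string attached
```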

3 Likes

I did get an initial version of this out. I’m just manually pulling the list of plugins for now. I tried that URL through Postman without much success; it’s flagging me as a bot and asking for a captcha.

2 Likes

That’s fixed now - not a joke, just an oversight on my part.

The link is: ChatGPT Chess Plugin - Terms And Conditions

1 Like

You weren’t the only one; there are a few, including Expedia.

2 Likes

I’m starting to look into this as well. As the first step, I’m thinking about how to merge the OAS files. My thought is to use the name_for_model from the manifest as the root context of a new path, and then append the path from the API spec in the OAS. This will make it easier for the plugin to map the different APIs to the correct backend service. It also makes each endpoint unique.
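Here’s a rough sketch of that merge step. The spec fragments are toy examples, and a real merge would also need to namespace components/schemas, which this ignores:

```python
# Prefix every path in a plugin's OAS with its name_for_model so the
# combined spec has unique routes that map back to the right backend.

def merge_oas(specs):
    """specs: dict of name_for_model -> OAS dict. Returns one merged OAS."""
    merged = {"openapi": "3.0.1", "paths": {}}
    for name, spec in specs.items():
        for path, ops in spec.get("paths", {}).items():
            merged["paths"][f"/{name}{path}"] = ops
    return merged

combined = merge_oas({
    "chess": {"paths": {"/move": {"post": {"operationId": "makeMove"}}}},
    "weather": {"paths": {"/forecast": {"get": {"operationId": "getForecast"}}}},
})
print(sorted(combined["paths"]))  # ['/chess/move', '/weather/forecast']
```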

1 Like

I have now gotten to the point where you can make any request. The ULP first searches for the appropriate plugin, then the endpoint is called (with optional parameters) and a direct request is sent to it.

Unfortunately this only works with plugins that don’t have OAuth and that work in general. A few have hit their rate limit, and others have an incorrect domain or server URL.

In general, the ULP would work if it were possible to access the endpoints without restrictions.

Which in turn means OpenAI could technically make all plugins available instead of limiting the user to only 3 plugins.