Plugin deployment / custom domains

I’ve successfully created a plugin for an open-source platform I work on. It works really well, and I find it very useful. However, most people will have this platform installed on-premise or in their own cloud. So, in theory, everyone should now be able to just type in the domain for their instance, and then ChatGPT can find all the information it needs from /.well-known/ai-plugin.json. But if OpenAI needs to “vet” every plugin, that means users cannot add their own domain.
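For context, the file ChatGPT fetches from each domain looks roughly like this sketch. The field names follow the published plugin manifest format; the domain, URLs, and auth details here are placeholders for a user’s own installation, not the actual plugin:

```python
import json

# Hypothetical minimal ai-plugin.json for a self-hosted instance.
# All URLs and names below are illustrative placeholders.
manifest = {
    "schema_version": "v1",
    "name_for_human": "My Platform",
    "name_for_model": "my_platform",
    "description_for_human": "Query your own instance of the platform.",
    "description_for_model": "Plugin for querying a self-hosted platform instance.",
    "auth": {"type": "oauth"},  # real OAuth config needs more fields; omitted here
    "api": {
        "type": "openapi",
        "url": "https://example-instance.com/.well-known/openapi.yaml",
    },
    "logo_url": "https://example-instance.com/logo.png",
    "contact_email": "admin@example-instance.com",
    "legal_info_url": "https://example-instance.com/legal",
}

print(json.dumps(manifest, indent=2))
```

The core of the problem in this thread: the `api.url` (and hence the whole manifest) is tied to one vetted domain, while every self-hosted instance would need its own.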

I could, of course, create a proxy in my own cloud installation and then establish “user mapping” to handle the requests on behalf of OpenAI, but I suspect something like that would also be disallowed in the terms.
Apologies for all the rambling. I suppose my question simply is: Does OpenAI plan to make it possible to either set a custom domain for each plugin (if supported), or allow users to supply their own domain and then allow it if the name_for_model matches an allowed plugin? Alternatively, is the only way to enable users to use the plugin function I’m building into my platform to create a public proxy that OpenAI can query?


I know you guys are super busy, but I can’t be the only one with this problem?
Are there any plans to add a way to authorize a plugin that can have one of many backends? Either by letting the user add a custom domain (the /.well-known/ai-plugin.json path speaks to a future where any website could potentially have an interface usable by OpenAI), or by letting the “main” domain send back a domain after receiving the user token, so that the plugin talks to whatever server the user has selected in our backend?
I don’t like the idea of having to proxy all traffic on behalf of the user (especially since a user might want to be able to use multiple different instances).

I think it’s not very likely that you’ll get support for this.

Your options are:

  • Open source your plugin so that people can deploy their own plugins.
  • Provide a public plugin that proxies onto the real servers.

Presumably, people have to log in to access the data - so your plugin must support OAuth in some way.

Or does it use some other mechanism to authenticate with the backend?

The plugin is already open source, and people can already host it themselves, but only a very limited number of people can request to publish their add-on’s domain, so the code is currently useless for anyone who doesn’t have plugin developer access to ChatGPT.

Yes, I use auth, so any data ChatGPT requests is limited to what that user has access to. But proxying that is going to be a pain, which is why I hope we can get a way for people to add their plugins themselves.

I can’t really see any other way of doing it. I don’t think it would be against any rules provided the API was always consistent and always talking to your backend (which would then be forwarding the requests to the customer’s backends).

It feels like a possible value add on top of the open-source platform (if you wanted to take it in that direction).

You could just wait until plugin development access is rolled out to everyone who is paying - that will happen eventually.

Another alternative would be to build chat functionality into the product - then they would just need an OpenAI API key which could just be part of their setup. The downside is that at the moment with Plugins, you are getting the chat side of things for free.
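The “build chat into the product” route would mean the platform calls the API directly with the customer’s own key. A minimal request-building sketch (the endpoint and payload shape follow the Chat Completions API; the model name is just an example, and the actual HTTP send is left out so the sketch stays offline):

```python
import json

API_URL = "https://api.openai.com/v1/chat/completions"

def build_chat_request(api_key, messages, model="gpt-3.5-turbo"):
    """Construct the URL, headers, and JSON body for a chat call.

    Sending the request (urllib, requests, etc.) and streaming the
    reply are deliberately omitted from this sketch.
    """
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"model": model, "messages": messages})
    return API_URL, headers, body

url, headers, body = build_chat_request(
    "sk-user-key", [{"role": "user", "content": "Hello"}]
)
print(url)
```

The trade-off stands as described above: with this route the platform pays per token via the user’s key, whereas with Plugins the chat side is free.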

Yes, agreed … I guess my main point in making this post was to raise awareness of the “issue”.
I really like that they made it so you could potentially host an OpenAPI endpoint anywhere that ChatGPT can use, but I also see the need to validate and vet the ones the public will be using for a while.
My thinking was that they could maybe add support for a signed proof token I could add to the manifest file, so you could add the add-on without developer access, or maybe add support for plugins with multiple custom domains, or … something along those lines … :)

Until then, I will see if I can add some kind of proxy that redirects API calls from ChatGPT to a custom backend, configured by the end user on “my end” …
I just need to double-check whether ChatGPT sends the user token for all requests (manifest files and API) or just the requests to the API … I have custom information in the manifest file that it would be useful to customize “per user” as well.
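One way to structure that proxy is to keep everything under /.well-known/ constant and apply the per-user mapping only to API paths. This is a sketch under assumptions: the routing rules, paths, and helper behavior here are illustrative, not a real framework:

```python
from typing import Optional, Tuple

# Served identically to everyone; changing it per user would mean the
# plugin no longer has one stable, vettable manifest.
STATIC_MANIFEST = '{"schema_version": "v1", "name_for_model": "my_platform"}'

def handle(path: str, auth_header: Optional[str]) -> Tuple[int, str]:
    """Toy dispatcher: static manifest, token-gated API forwarding."""
    if path == "/.well-known/ai-plugin.json":
        # No token required; the manifest is constant for all users.
        return 200, STATIC_MANIFEST
    if path.startswith("/api/"):
        if auth_header is None:
            return 401, "missing token"
        # A real proxy would resolve the user's instance from the token,
        # forward the request there, and relay the response.
        user = auth_header.removeprefix("Bearer ")
        return 200, f"forwarded {path} for {user}"
    return 404, "not found"

print(handle("/.well-known/ai-plugin.json", None))
```

If it turns out the token is only sent on API calls and not when fetching the manifest, per-user manifest content would not be possible anyway, and this split becomes the natural design.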

Changing the contents of the manifest file will trigger the plugin to go back into the “unverified” state - so it may be best to keep that all constant.

I’m not entirely clear on your use case. But here is a proxy for openapi services. You can have a single ChatGPT plugin that goes out and uses various services.

Cool, thank you. I will have a look at that repo.