Local plugin failure via localhost - plugin id not found

I have been working with a custom plugin for months and this morning it broke. I uninstalled and reinstalled, and everything installed OK. However, when I try to use the plugin I get an error:

Plugin for id complete-manifest-dc456121a-d965-5e92-1a42-821b459eb7f6 not found

I reinstalled again, and now I'm seeing this in the console:


POST https://chat.openai.com/backend-api/conversation 400 (Bad Request)


Yeah, I'm observing the same behaviour too!

While installing the plugin, it shows up and the JSON manifest imports properly, but making a request produces this error!
Refreshing the window removes the plugin :frowning_face:

Same issue, I can't use local plugins at all:
“Plugin for id complete-manifest-xxx not found”


Same issue here. It has been broken for 12+ hours now.


Me too.
I have been developing my plugin on localhost, but I just noticed that the localhost plugin is no longer available.

The error message is the same: “Plugin for id complete-manifest-XXXX not found.”

Is this a bug in OpenAI's ChatGPT plugin system?
If it is, I'll wait for a fix.

I am having the same issue. I was actually relieved to see others having it too. Hopefully they fix it soon.

Same here; it's a pretty huge issue for us.
It's probably because of the deployment of all the new features.

It's been going on for over 24 hours now. I wish OpenAI would at least acknowledge that they plan to fix it.


Same here. ChatGPT Plugin Development has completely halted overnight.

@logankilpatrick Could you help?
I also got the new update, and localhost plugins are failing with “plugin manifest not found”.

Love what you guys are doing, and this happened after the latest update you rolled out, but please, you can't keep breaking existing functionality with every new update :slight_smile:


Can you add this post to the Bugs category? They might find it more easily there.


I too am having this error. Everything was fine and working Monday morning of this week; then I had a demo and this error started. Local plugins are broken.

Yep, just added to the plugin-bug category.


Same problem for me.

I just recently gained access to the new GPTs and was able to create an “Action” using ngrok, but I have not tried localhost yet; that's on my to-do list for the next few days. In the meantime, if you gain access to GPTs, look into a tool like ngrok to keep your projects limping along. Otherwise, wait?
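To illustrate the ngrok workaround: after running something like `ngrok http 5003` (port 5003 is just an assumed dev-server port here), you get a public HTTPS URL, and the localhost references in your `ai-plugin.json` manifest need to point at it instead. This is only a hedged sketch with a hypothetical helper, not an official OpenAI tool:

```python
import json

def rewrite_manifest(manifest: dict, public_url: str,
                     local_base: str = "http://localhost:5003") -> dict:
    """Return a copy of an ai-plugin.json-style manifest with localhost
    URLs replaced by the public tunnel URL ngrok printed."""
    patched = json.loads(json.dumps(manifest))  # cheap deep copy
    # Rewrite the top-level logo URL if it points at the local server.
    if patched.get("logo_url", "").startswith(local_base):
        patched["logo_url"] = public_url + patched["logo_url"][len(local_base):]
    # Rewrite the OpenAPI spec URL the same way.
    api = patched.get("api", {})
    if api.get("url", "").startswith(local_base):
        api["url"] = public_url + api["url"][len(local_base):]
    return patched

manifest = {
    "schema_version": "v1",
    "api": {"type": "openapi", "url": "http://localhost:5003/openapi.yaml"},
    "logo_url": "http://localhost:5003/logo.png",
}
# Example tunnel hostname below is made up; use the one ngrok prints.
patched = rewrite_manifest(manifest, "https://abc123.ngrok-free.app")
print(patched["api"]["url"])   # → https://abc123.ngrok-free.app/openapi.yaml
print(patched["logo_url"])     # → https://abc123.ngrok-free.app/logo.png
```

You then serve (or register) the patched manifest so ChatGPT fetches everything through the public tunnel rather than localhost.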

I wish we had better communication from OpenAI on whether localhost is going to be supported, etc. Maybe at best they could learn to communicate these types of changes a bit better to us non-whale users?

Either way, the new GPTs with custom instructions of up to 8,000 characters are EXCELLENT! The changes have been very well thought out!

I just tried it and it doesn't seem to work, unless I'm missing something: I don't see them hitting my endpoint. Any public endpoint works, but localhost does not, which would be a huge disappointment if they removed it.

I certainly agree about comms; they've been really light on communication, which is disappointing!

This was my assessment as well. Unless you have GPTs access, the older plugin flow seems to be cut off, which makes total sense, but again, they could communicate better.

And to add to that, I really like the direction they are going: custom context of 8K characters per GPT is exactly what was needed!

EDIT: using ngrok works! localhost does not.

I have the same problem. Is this a bug, or does local plugin development no longer work?

For anyone still reading this and waiting for a fix: they have moved this functionality to Actions. See the OpenAI Platform docs.

If that is true, then how can we develop custom Actions against localhost? Is it no longer possible?