Whoa. Is this something with GPT-4? Implementing a database integration (creation, insertion, querying) with gpt-3.5-turbo was difficult. I'm on my phone right now, so pardon me if I didn't read all the context.

The forum category is "plugin-development". The first screenshot shows a localhost API used to interface with the non-public plugin under development. Unless the plugin has a long history of working properly and ChatGPT only started throwing the error today with no changes to the plugin, database, or API, the likely problem is in what the developer is doing with the plugin.


Ahhh, I understand. Thank you, sir/ma'am!

I've seen the error a few more times. I can confirm that it's not just with my plugin; it happens randomly. However, now hitting retry seems to be working. I just wish I didn't have to kill that 1/25.

edit: I take it all back. It got worse, and I realize even GPT-3 has been bugging out on me. Looks like I'm just gonna quit for the day.

Also, the logs show normal.

FWIW I am also seeing “This model’s maximum context length” since yesterday, in a plugin I’m developing. First interactions are ok, but the 3rd or 4th will give the error.

Though the responses the plugin returns to ChatGPT are large, they never hit ResponseTooLargeError, so it is unclear to me how to proceed. I've tried to limit the prompt/AI instruction length, but I still get the error.

Also have the impression this is new since yesterday, but not because the plugin responses have gotten larger.
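Since a plugin's response is fed back into the model's prompt, one workaround for the poster above is to cap the response size before returning it. Here is a rough, hypothetical sketch: the 4-characters-per-token figure is only the common heuristic for English text, not an exact count (an actual tokenizer such as tiktoken would be precise), and `cap_response` is an illustrative helper, not part of any plugin API.

```python
CHARS_PER_TOKEN = 4  # rough heuristic for English text, not an exact measure


def cap_response(text: str, max_tokens: int) -> str:
    """Truncate text so its *estimated* token count stays under max_tokens.

    This approximates tokens by character count; a real tokenizer would be
    needed for exact budgeting against the model's context window.
    """
    max_chars = max_tokens * CHARS_PER_TOKEN
    if len(text) <= max_chars:
        return text
    return text[:max_chars]
```

With a cap like this, a plugin can keep its payloads under ResponseTooLargeError territory while still leaving headroom in the model's context window for the rest of the conversation.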

Hmm, I'm not sure it's related to any one plugin. It's happening to me no matter which one I use.

The model responses in general seem to have gotten way worse as well.

I too got the same error many times today.

This model’s maximum context length is 8193 tokens, however you requested 8689 tokens (7153 in your prompt; 1536 for the completion). Please reduce your prompt; or completion length.
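The arithmetic behind that error is straightforward: the prompt tokens plus the requested completion tokens must fit inside the model's context window, and 7153 + 1536 = 8689 exceeds the 8193-token limit the message reports. A minimal sketch of that check (`fits_in_context` is an illustrative helper, not an API call):

```python
CONTEXT_WINDOW = 8193  # limit reported in the error message above


def fits_in_context(prompt_tokens: int, completion_tokens: int,
                    window: int = CONTEXT_WINDOW) -> bool:
    """Return True if prompt + requested completion fit in the context window."""
    return prompt_tokens + completion_tokens <= window


# The numbers from the error: 7153 prompt + 1536 completion = 8689 > 8193.
print(fits_in_context(7153, 1536))  # False
```

What made the reports in this thread suspicious is that users weren't constructing these prompts themselves; the ChatGPT UI was apparently overfilling the window on its own.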

And three's a pattern. Make sure you're reporting those errors.

I'm getting the same thing.

This model’s maximum context length is 8193 tokens, however you requested 8195 tokens (6682 in your prompt; 1513 for the completion). Please reduce your prompt; or completion length.

Also, did anyone else's ChatGPT turn into Hawaiian? It doesn't stop saying aloha or mahalo to me now, with flower and tree emojis. So weird.

That is unusual, possibly some server-side issue while modifications are made following the recent announcements.

Now that I look further, I noticed that plugins for new chats appear to be ‘disabled’, but not in a way that feels official:

Seems at the very least the plugin section is under maintenance.


I have also been running into this error with GPT-4 + plugins. Seems to be a bug, has anyone found a way around it?

While we can't create new chats with plugins, if you have an existing chat with existing plugins, I've had success hitting "Regenerate response" and it'll go through. Unfortunately, this effectively cuts your message allowance from 25 messages per 3 hours to 12.
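The 25-to-12 figure follows from simple arithmetic: if every prompt also needs one "Regenerate response" to get through, each usable reply consumes two messages from the cap. A tiny illustration (the cap value is the one stated in the thread):

```python
MESSAGE_CAP = 25          # GPT-4 messages allowed per 3-hour window
MESSAGES_PER_REPLY = 2    # original send + one "Regenerate response"

# Integer division: 25 // 2 leaves 12 usable replies per window.
effective_replies = MESSAGE_CAP // MESSAGES_PER_REPLY
print(effective_replies)  # 12
```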


That seemed to work; I did have to give it a few minutes beforehand, though. I burned through 5/25 with it erroring out. After letting it breathe, closing and refreshing the page, and then regenerating, it seems to work.

Curious to know what may be happening under the hood. Thanks for the help here @acesonnall


Welp, called it too soon. Could be linked to the latest announcement about GPT-4 deprecations.

Now it is erroring out during the use of a plugin, and when I leave the tab and then return, my message history momentarily vanishes.

EDIT: And the pièce de résistance, when thumbs downing and reporting the entire interface went down with an “Oops, there was a problem.”

Hopefully we’re back to normal tomorrow.


The problem is in the Plugins service.

I have the same error, and all plugins stopped working. What is going on???

Please be patient while maintenance work is done on the Plugin system for Code Interpreter and other functionality updates. It's affecting everyone; these things are usually brief.


The errors seem to have abated. I get that maintenance happens; I just wanted to make sure that's what it was, and not something I did.


I haven't seen the max context length error anymore today. Also… can't shake the feeling that there is a notably smarter GPT-4 behind the plugin runner(?)