I'm getting the error below for my conversations. Is anyone else having this problem?
This model’s maximum context length is 8193 tokens, however you requested 8689 tokens (7153 in your prompt; 1536 for the completion). Please reduce your prompt; or completion length.
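For context, the numbers in the message have to add up: the prompt tokens plus the requested completion length (max_tokens) must stay within the model's context window, so 7153 + 1536 = 8689 exceeds the 8193 limit. If you're hitting this through the API rather than the chat UI, a minimal pre-check sketch might look like the following (assuming a Python setup with the tiktoken library; the model name, limit, and completion size here are illustrative, not a fix for the web app):

```python
import tiktoken

MODEL = "gpt-4"          # assumed 8k-context model
CONTEXT_LIMIT = 8192     # adjust to your model's actual window
MAX_COMPLETION = 1536    # the completion length you plan to request

enc = tiktoken.encoding_for_model(MODEL)

def fits_in_context(prompt: str) -> bool:
    """Rough check: prompt tokens + requested completion must fit the window.

    Note: this counts raw text tokens only and ignores the per-message
    overhead of the chat format, so treat it as an approximation.
    """
    prompt_tokens = len(enc.encode(prompt))
    return prompt_tokens + MAX_COMPLETION <= CONTEXT_LIMIT

conversation_text = "..."  # your conversation history flattened to text
if not fits_in_context(conversation_text):
    # Trim older turns from the history or lower MAX_COMPLETION
    print("Prompt too long for the requested completion length")
```

In the chat UI you can't set these values directly, which is why long conversations seem to hit the wall on their own.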
I just got this error as well. It was working fine for my earlier conversations until just now. Not sure what the problem is; my prompt is maybe 200 tokens.
“This model’s maximum context length is 8193 tokens, however you requested 8195 tokens (6726 in your prompt; 1469 for the completion). Please reduce your prompt; or completion length.”
Yep, I'm getting this as well. What's strange is that if I open a new chat, everything works fine. So I'm thinking it either has to do with plugins, or our chats have become bugged in some way.