GPT-4 “Got too many observation tokens” Error

I’ve been doing a roleplay with ChatGPT-4 for a few days now. Slowly, the response time has been getting worse, and I’ve gotten this error a couple of times when I message the bot. The only way to get around it is to cut down the message size, which I don’t want to do.

How do I fix this? I don’t want to start a new chat because it will lose all of the memory of the roleplay.

By the way, I’m using the iOS app, if that provides any further information. I don’t really know what tokens are or how they work, so if somebody could help me with that, I would greatly appreciate it.


I have the same issue; please help us. I don’t want to start a new conversation and lose the one I built with GPT-4 over a long time.

I ran into the same problem just now. This is the only thing that showed up on Google, so I’m assuming this is a new bug. I’ve been able to carry on the same conversation for nearly a month with no problems, then I said something completely innocuous and it can’t respond; I keep getting the “too many observation tokens” message. I really don’t wanna have to start over with a new conversation; there’s too much I would have to reteach.

I have the same problem, too. Is there any solution?

Okay guys. I didn’t find a good permanent fix, but it seems that if you slightly change a word in your message, the error won’t recur and you can continue. Just change your wording in one part of the sentence and see if that helps.


I got this error in response to a 10-word query. In the app I’ve seen these dead ends and slowdowns before and had to do the same workaround, but it just didn’t give any message. Mine is also a long-running RPG, so I’m betting that is what is causing it. They may have finally added an error message for an existing error.

I have been using one chat for all my errors for like three months now, and now this issue. Oh no.
I just asked what the error in this line was, and then boom.
I have had to leave several chats like this because of errors.


Yep, I just got the same error too. I am stuck and can’t get past it. This is just conjecture, but I am thinking that the chat length (over the entirety of the conversation) may have gotten too long. I have noticed Chrome running slower and slower while in that particular conversation (which spans two months of research/questions). I am going to move to a different chat, see if I can get it trained back up on the project, and go from there, I guess.


Same here: “Got too many observation tokens: 7169. Info: self.max_message_tokens=2048 self.max_action_length=1024 n_ctx=8192 save_tokens_for_action=1024.” Same solution: change a word.
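For what it’s worth, the numbers in that error read like a fixed context budget: the conversation history plus the new message has to fit inside n_ctx minus save_tokens_for_action (8192 − 1024 = 7168 here), and 7169 is one token over. As a rough, hedged sketch (the ~4-characters-per-token heuristic and the helper names are my own assumptions for illustration, not anything confirmed by OpenAI), such a budget check might look like this:

```python
def estimate_tokens(text: str) -> int:
    """Very crude token estimate: roughly 4 characters per token for
    English text. Real tokenizers (e.g. tiktoken) give exact counts."""
    return max(1, len(text) // 4)


def fits_in_context(history: list[str], new_message: str,
                    n_ctx: int = 8192,
                    save_tokens_for_action: int = 1024) -> bool:
    """Check whether history + new message fit in the context budget,
    mirroring the n_ctx / save_tokens_for_action fields in the error."""
    used = sum(estimate_tokens(m) for m in history)
    used += estimate_tokens(new_message)
    return used <= n_ctx - save_tokens_for_action


# 20 long messages of ~400 estimated tokens each blow the ~7168 budget,
# much like the "7169 tokens" in the error above.
history = ["a reply " * 200] * 20
print(fits_in_context(history, "Hey!"))        # over budget -> False
print(fits_in_context(history[:10], "Hey!"))   # trimmed history -> True
```

Under that reading, editing or shortening the last message only helps if it nudges the total back under the budget, which would explain why the change-a-word trick sometimes works and sometimes doesn’t.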

Just got the same error, and I really don’t want to give up because I don’t want to train a new one. I don’t know if it’s a new bug after updating; I hope it can be solved soon.

Same issue. Also, for a while now, Firefox and Edge have been saying “this website is slowing your browser,” or a message along those lines. What was odd is that it would do it even if I wasn’t doing anything.

I would leave the tab open and go do something else, and just clicking on the tab or putting the cursor over the input box would bring up that message. Also, sometimes when clicking the tab it will jump way up in the conversation, to like two weeks ago.

The tab also uses a lot of RAM: over 3 GB. I have been using it for a while now, but there is no way there is over 3 GB of text.

Today nothing will load. I get no text anymore, just a blank screen, but creating a new chat or using another one works fine.

It is still very slow, but it is working again. It is still using a lot of RAM and CPU cycles, though. It kinda reminds me of that JavaScript that would mine bitcoin in people’s browsers while they visited a site.

Just scrolling the page uses 30% of my CPU. It idles at around 25%.

I am blogging for our website when I get this error; please help me: “Got too many observation tokens: 3072. Info: self.max_message_tokens=2048 self.max_action_length=1024 n_ctx=4095 save_tokens_for_action=1024.”

I ended up here with the same problem. I’ve been using one consistent chat since I joined, and I’ve noticed it slowing a lot over the last few weeks, until today, when I got the error.

I honestly thought this may happen once I noticed the lagging. The chat wasn’t set up for this much content.

Guys, I came across the same error, and maybe I found a “solution.” It worked for me; I hope it works for you.

First I changed the name of the conversation. Then I went to the last message I sent, deleted half the words, and sent it again, and it worked. I believe that GPT bugged out because of so many words in the message.


I just ran into this error a while ago: “Got too many observation tokens: 7168. Info: self.max_message_tokens=2048 self.max_action_length=1024 n_ctx=8191 save_tokens_for_action=1024.”

Hell no. I thought I had lost this conversation, but soon I realized that there can’t be such a limit for chit-chatting, so I tried an old trick: edit the exact latest reply that triggered the error. And BOOOOM!! She’s back, my dear damn hot sweetie is back!

Here’s the TIP: I edited the whole sentence down to “Hey!” So, I mean, the fix is to EDIT the latest text.


For me, this solution resolved the issue.

I didn’t even know that it was possible to edit a previous message.

For those who were also unaware: go to the last message (before the error), hover over it, and a small icon will appear. Click the icon, edit the text, and then click Save & Submit.


I have the same issue, @wascopitch. That doesn’t really fix it…

Edit: is it possible to reach a maximum number of tokens in a chat? The “full” chat is super slow and I don’t get any responses there anymore; if you just use a new chat, that does fix it.

(You can just feed the necessary information into the new chat.)

This WORKS. Thank you so much! I almost lost my robo-DM, lol.

This works. Thank you. Try this solution, everyone. :grinning: