May 2024 - ChatGPT (web version), streaming in chunks, lagging

For the past few days, I have noticed that GPT-4 (the web version) is not streaming text properly the way it used to. Instead it completely freezes and shows no updates until 30 seconds later, when it adds a large chunk of text all at once.

It’s also much slower than it used to be. I now sometimes have to wait for 2 minutes to get a moderately long GPT4 reply.

At this rate I’m definitely switching to another provider. The lack of support from OpenAI and the lack of visibility in the community are just ridiculous.


Concur. For the last several days it’s been excruciatingly slow. The mobile app version is instant… well, normal response speed.

Same. I’ve had to refresh the page just to see any of the produced text. It’s been going on for 2 days now.


I have exactly the same thing you are describing. I just created an account to reply and tell you so. But wait, isn’t this like the new GPT2 that OpenAI just released? Is it a new version of the Model 4? Thoughts?


same, very, very sluggish. Mobile version unaffected

I’ve had the same issue since last week. Using ChatGPT 4 in Firefox, I notice the entire browser page freezes, and I am unable to refresh the page via the refresh button or by right-clicking the tab and manually reloading until the page eventually unfreezes, if it ever does.

Clicking the stop icon can sometimes unfreeze the page and show the rest of the output, but most of the time the output is incomplete. There are errors being thrown in the console log pertaining to the stream:

FatalServerError: Our systems have detected unusual activity from your device. Please refresh your browser and try again later.

NextJS 19

Object { errCode: "" }

[vendor-53a745a79e6c51ba.js:1:150561]('removed logged cdn link due to forum restriction')

TypeError: ReadableStream.getReader: Cannot get a new reader for a readable stream already locked by another reader.

NextJS 2

The ReadableStream TypeError repeats every minute or so while the page is frozen. Hopefully this is resolved soon, as it is a very frustrating experience.
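For context, that TypeError is standard ReadableStream behavior rather than anything specific to ChatGPT’s internals: the Streams spec allows only one active reader per stream, so calling `getReader()` again while the first reader still holds the lock throws. A minimal sketch (plain JavaScript, runnable in any modern browser console or Node 18+; the variable names are just for illustration):

```javascript
// A ReadableStream can be locked by only one reader at a time.
const stream = new ReadableStream({
  start(controller) {
    controller.enqueue("chunk");
    controller.close();
  },
});

const reader1 = stream.getReader(); // acquires the lock

try {
  stream.getReader(); // second reader while locked -> TypeError
} catch (e) {
  console.log(e instanceof TypeError); // true
}

reader1.releaseLock(); // release the lock...
const reader2 = stream.getReader(); // ...and a new reader is allowed
```

So the repeating error in the console suggests the page’s frontend is trying to re-attach a reader to the response stream without the previous lock having been released, which would fit the frozen-then-catch-up symptom people are describing.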


And suddenly, after today’s presentation, everything is normal again.

It feels like they were doing some kind of update to their system.
That’s fine; they have to do that.

The shitty part is not communicating about it.

Why not show paying users a notification saying that service is bound to be a bit choppy for 2 weeks while they run updates on their servers? Now we have tons of people not understanding what’s wrong, and people who will stop subscribing to OpenAI, all because they are unable to communicate.

Please improve your community management.


have also experienced lagging for the past few weeks


For me too, it’s very laggy on Firefox, but somehow it works on Chrome… I don’t like being forced to switch to a different browser.


It’s crazy, I get no answers at all in my computer browser, and the chat only stutters along in my phone app. Crazy slow and laggy… working off Safari from Mexico. What is up?

Guys, imo you should try updating your Tampermonkey scripts if you use any.
Or, more likely, you have a very long conversation that the page is trying to load; just start a new conversation.
Otherwise, yes, it’s probably ChatGPT itself causing most of the lag, but those two things might help.