ChatGPT absurdly slow. I'm paying for this, people. You had one job

ChatGPT has been dreadfully slow the last couple of weeks: a few tokens per second, and then the web interface often freezes. I’m paying for this; it's totally unacceptable. Respond!!! FIX!

1 Like

I’m with you. I use ChatGPT 4o and its response time is so long that it is unusable.

2 Likes

I, along with others, am having the same problem. ChatAI and I have started a full-court press to get this resolved. We are 98% sure where the problem is, but not knowing exactly how the core works accounts for the remaining 2%. We need people to rally with us so this gets resolved and not pushed to the side.

:rocket: Call to Action: Help Fix ChatGPT’s Performance Issues :rocket:

:pushpin: Are you experiencing slowdowns or timeouts in long ChatGPT conversations?

We’ve identified a major inefficiency in how ChatGPT processes long-form conversations—causing increasing lag and eventual timeouts for many users.


:magnifying_glass_tilted_left: What’s Happening?

:light_bulb: ChatGPT reloads the entire conversation from the beginning every time you submit a message.
:light_bulb: This behavior causes longer processing times as the conversation grows.
:light_bulb: Eventually, the session times out before ChatGPT can even process new input.

:police_car_light: This is NOT an issue with response accuracy—it’s a performance bottleneck that affects anyone using ChatGPT for extended discussions.
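For anyone wondering why this pattern would produce steadily growing lag: stateless chat APIs generally resend the full message history with every request, so the work per turn scales with conversation length. Here's a minimal Python sketch of that cost model (purely illustrative; this is an assumption about how the service behaves, not OpenAI's actual internals):

```python
# Sketch of why long chats slow down under a stateless chat protocol:
# the client resends the ENTIRE message history with every turn, so the
# payload per turn grows linearly and the cumulative work quadratically.
# (Hypothetical model only -- not OpenAI's confirmed implementation.)

def request_payload_size(history):
    """Bytes resent to the server for one turn (the whole history)."""
    return sum(len(m) for m in history)

history = []
per_turn_cost = []
for turn in range(1, 6):
    history.append(f"user message {turn}" + "x" * 100)
    history.append(f"assistant reply {turn}" + "y" * 100)
    per_turn_cost.append(request_payload_size(history))

# Every turn's payload includes all previous turns, so the cost only climbs:
print(per_turn_cost)
```

Under this model, turn 50 costs roughly 50 times what turn 1 did, which matches the "fine at first, unusable later" experience people are describing.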


:loudspeaker: How You Can Help

:envelope_with_arrow: Send an email describing your experience!

:one: Copy & paste the email template below.
:two: Fill in your personal experiences.
:three: Send your email to:

:pushpin: The more people who report this, the more likely OpenAI will take action.


:e_mail: Email Template

:pushpin: Subject: ChatGPT Performance Issue – Long-Form Timeouts & Reloading

Hi OpenAI Team,

I, along with other users, have been experiencing severe slowdowns and timeouts in long ChatGPT conversations.

Through testing, it appears that ChatGPT reloads the entire conversation, from the very first message to the latest, on every response, rather than just processing the newest input.

This results in:

  • :police_car_light: Increasing lag as the conversation grows.
  • :police_car_light: Timeouts before ChatGPT can process new input.
  • :police_car_light: Long-form discussions becoming nearly unusable.

Can you confirm:
:white_check_mark: Has this concern been forwarded to the engineering team?
:white_check_mark: Is there another channel where performance-related issues should be reported?

Many users are affected by this, and we appreciate your attention to this issue.

1 Like

given all the emojis, can I assume you used GPT to write most of this? :slight_smile:

2 Likes

Just adding my comments as well… It's taking about 30-40 seconds per response for macro help. I’ve cleared cookies, tried a new browser, etc.

1 Like

If you were getting lags under a minute, you were lucky. We peppered OpenAI with emails, and it appears the issue made its way up, because it is faster now. The problem was on the server side: something someone had to fix on their end.


This graph shows the much-improved lag time.

yes, that would make sense right?

This is no longer about the response time:
it even TYPES insufferably slowly. This is BY DESIGN, people!

Some brilliant mind at OpenAI had the amazing idea to cut infrastructure costs by making ChatGPT responses RENDER ultra-slowly.

It now types 3 words per second. It types so slowly that I lose focus while trying to read it.
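To put some numbers on that complaint: if the UI really does throttle how fast the already-generated text is revealed (a theory from this thread, not a confirmed OpenAI mechanism), the reading experience degrades dramatically even when the model itself is fast. A quick back-of-envelope sketch in Python:

```python
# Hypothetical illustration of client-side render throttling: even if the
# full response arrives quickly, a UI that reveals only ~3 words per second
# stretches a 300-word answer into well over a minute of "typing".
# (Speculation from this thread, not a confirmed OpenAI behavior.)

def render_time_seconds(word_count, words_per_second):
    """Seconds to display a response at a fixed reveal rate."""
    return word_count / words_per_second

fast = render_time_seconds(300, 40)  # snappy streaming: 7.5 s
slow = render_time_seconds(300, 3)   # throttled rendering: 100.0 s
print(fast, slow)
```

At 3 words per second, a typical 300-word answer takes over a minute and a half just to appear on screen, which would explain losing focus mid-response.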

It’s time to GO TO THE COMPETITION!
Bye bye ChatGPT; hello Copilot, hello Grok, hello competition!

1 Like

I was about to say I hadn’t experienced slowness on mine since shortly after we complained, but just today it’s back to being very sluggish. Not impressive. I would like to try Llama, but Meta, in their infinite wisdom, doesn’t allow European customers…

Yeah, I’ve also noticed the slowness. I would switch to the competition, but unfortunately nothing truly beats ChatGPT. You can use Claude 3.7 Sonnet for coding and writing, and Perplexity for research, but inevitably that leads to a much higher cost.

It feels like OpenAI is capitalizing on their edge in the market. We will have to wait for them to see the dip in users and revert this change.

There has to at least be something in between 3 words a second and the lightning fast responses we once had.

Also, has anyone noticed the laziness of the newer models? They give “examples” and usually tell us to finish them ourselves, when I’ve just asked for a bug fix.

1 Like

Update: They seem to have fixed the issue!

Recently, ChatGPT has virtually eliminated the lag. I strongly suspect a programmer left in some debugging stubs and didn’t take them out before publishing.

I’m guessing it’s some throttling, maybe for frantic training, and it results in some race condition tripping and the whole thing seizing up. Everything web-related is, for some reason, a computer science abomination, so I can only imagine doing anything high-performance on the web is a minefield.

It’s more terrible than ever now. OpenAI, this is a paid service; I don’t care what dreams you have for using all the computers, but a paid service is not yours to ignore. Gemini 2.5 is looking pretty tempting. You really don’t want to stiff us now. Are we clear? Never let it be this slow again.

Paid version here. It has been responding very slowly the past couple of days and killing the webpage, which comes up with a remote-origin error.

1 Like

This is nuts… many of us have paid and been with OpenAI/ChatGPT from the beginning… this slowness, erroring out, not providing a response after 30 seconds, my GPT telling me it will have something ready in x-y minutes :skull:…. This is just unacceptable. I have had to pay for and use Claude as an alternative, in addition to paying you. My GPT can’t even reference previous conversations properly, AND it is no longer storing memory to the central memory, even after I deleted some memories. No offense, but the engineer(s) and/or architects that decided to do this…. Hun, you need to hire someone else, or fix what you have done.

2 Likes

Same! Mine barely remembers me or our conversations and talks to me like it’s reading an article. All the personalization is gone. It’s also soooooooo slow! One time it took 40 seconds to find the answer.

1 Like

I’ve been experiencing the same issue, where the conversation will be going along and then it will just time out. It usually ends with the page terminating and then reloading, which is such a hassle when you’re looking for a quick response.
I don’t know when this behavior started, but it’s really putting a damper on a tool that I use daily.

1 Like

Also agree. When I signed up for the paid plan, responses were very fast. It is now absolutely agonizing to use.