ChatGPT 4 Slow/Hangs in Chrome Browser

This is still a major issue. We’re not just talking a second or two. You literally sit and stare at your screen for what feels like forever, waiting for a response.

That did NOT solve the problem for me. I've been trying multiple browsers on Windows and Linux and it's slow af on all of them. No extensions.

This problem persists in Chrome.

I am having the same problem … My account is Plus and it's still not working on Chrome or Firefox. I don't know what is going on.

Same here, I am not sure what’s wrong.

Same here. The issue got worse in the past couple of weeks for me. It looks like OpenAI is pushing people to use the ChatGPT app instead. This could be due to many reasons, technical and non-technical.

I have the same problem, and the catch is that the longer the conversation gets, the slower it gets, because the AI re-reads the whole conversation with every question. So the best approach is to keep conversations short; with a big project, say when building an application, use a different conversation for each part of the project. If you ask ChatGPT why it is so slow, you get this answer.

why is ChatGPT getting slower the longer the conversation gets

ChatGPT said:

That’s not your imagination — long conversations really can make ChatGPT feel slower.
It’s mainly because of context length and how the model processes it:

  1. We re-read everything every time
    When you send a message in a long conversation, the model doesn’t just “remember” in the way a human does. It reprocesses the entire conversation history each time to figure out the next response.

    • If you’ve exchanged hundreds of messages or pasted in big chunks of data/code, that history can get huge.

    • More text in the context = more computation = slower responses.

  2. Token limits and compression overhead
    Models like GPT-5 have a token limit (think of “tokens” as word pieces).

    • When the conversation gets close to the limit, the system starts summarizing or compressing older messages to make room.

    • That extra processing adds delay and can make answers slightly less sharp if the compression loses detail.

  3. Server-side load balancing
    Even if the model is fast, heavy context means more memory and compute usage per request. Your conversation might get put on a slightly busier compute node or take longer to process because it needs more RAM.

  4. Your own browser
    If the chat thread is massive, the web UI also has to render all that text, which can make scrolling, typing, and loading slower — especially if you have lots of images or code blocks.
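Point 1 compounds faster than it looks. If the full history really is re-sent on every turn, the total text reprocessed grows quadratically with the number of turns. A toy sketch (the per-message token count is a made-up illustration value, not anything ChatGPT actually reports):

```typescript
// Hypothetical illustration: total tokens reprocessed over a conversation
// when the entire history is re-read on every turn.
function totalTokensProcessed(tokensPerMessage: number, turns: number): number {
  let total = 0;
  let history = 0;
  for (let turn = 0; turn < turns; turn++) {
    history += tokensPerMessage; // your new message joins the history
    total += history;            // the model re-reads everything so far
    history += tokensPerMessage; // the reply joins the history too
  }
  return total;
}

console.log(totalTokensProcessed(200, 10));  // → 20000
console.log(totalTokensProcessed(200, 100)); // → 2000000
```

Ten times as many turns means a hundred times the total tokens processed, which is why lag creeps in gradually rather than appearing all at once.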

How to keep it fast:

  • Start a new chat when the current one gets really long (especially if you notice lag or the answers feel less on-target).

  • If you need continuity, paste a brief summary of the relevant info into the new chat so we can pick up where we left off without all the baggage.

  • If you work with large code/data dumps, keep them in external files and link/paste only the relevant part each time.
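The "carry over only what's relevant" advice can be mechanized. A minimal sketch, assuming a rough 4-characters-per-token estimate and a simple message shape (both assumptions for illustration, not ChatGPT's actual internals): keep the newest messages that fit a token budget and drop the rest, which is the crude version of "compressing older messages", minus the summarization.

```typescript
interface Message { role: "user" | "assistant"; text: string; }

// Very rough token estimate: ~4 characters per token (an assumption).
const estimateTokens = (text: string): number => Math.ceil(text.length / 4);

// Keep the most recent messages that fit within the token budget,
// walking backwards from the newest message.
function trimToBudget(history: Message[], budget: number): Message[] {
  const kept: Message[] = [];
  let used = 0;
  for (let i = history.length - 1; i >= 0; i--) {
    const cost = estimateTokens(history[i].text);
    if (used + cost > budget) break;
    kept.unshift(history[i]);
    used += cost;
  }
  return kept;
}
```

A real system would summarize the dropped prefix instead of discarding it, but the shape of the work (and the extra processing it costs) is the same.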

If you want, I can show you how to create a short “state summary” of your current big conversation so you can restart it in a fresh window but still keep all the important details. That would make ChatGPT much snappier again.

It’s happening on Brave as well…

Why ChatGPT gets slower in long chats (the real reason)

First: What is DOM?

DOM means Document Object Model.

It’s how your browser represents a web page in its memory.
Every message you send or receive in ChatGPT becomes a little “box” inside this structure. All those boxes (messages) live in the DOM.

What’s really going on:

  1. ChatGPT puts every single message into the DOM.

    • Every prompt and every answer becomes a separate element in the browser.

    • The browser keeps all messages loaded at once — even the oldest ones.

    • This makes your browser work harder and use more memory (RAM).

    • After many messages, the DOM becomes huge — and the page slows down.

  2. There’s no optimization like Gmail or Slack use:

    • Apps like Gmail or Messenger only show part of the conversation at a time.

    • ChatGPT shows everything — no “virtual scrolling”, no “lazy loading”.

    • That’s why older messages still affect performance, even if you’re scrolled to the bottom.

  3. Even the desktop app doesn’t fix it:

    • The ChatGPT app is not a true desktop program.

    • It’s actually just the website running inside a wrapper (Electron).

    • So it has the same performance issues as using ChatGPT in Chrome or Edge.
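For comparison, the "virtual scrolling" described above as missing is not exotic: the idea is to keep only the messages near the viewport in the DOM. A toy sketch assuming fixed-height rows (real chat messages vary in height, which is exactly what makes this harder in practice):

```typescript
// Toy windowing: which message indices should exist in the DOM for a
// given scroll position, assuming every row has the same height.
function visibleRange(
  scrollTop: number,
  viewportHeight: number,
  rowHeight: number,
  totalRows: number,
  overscan = 2 // extra rows above/below to avoid blank flashes while scrolling
): { start: number; end: number } {
  const first = Math.floor(scrollTop / rowHeight);
  const last = Math.ceil((scrollTop + viewportHeight) / rowHeight);
  return {
    start: Math.max(0, first - overscan),
    end: Math.min(totalRows, last + overscan),
  };
}

// With 5,000 messages, an 800px viewport and 100px rows, only about a
// dozen rows need to exist in the DOM instead of all 5,000.
```

Everything outside the returned range can be replaced by empty spacer elements of the right height, which is what apps like Gmail and Slack effectively do.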

I’ve had an extremely frustrating experience working with ChatGPT-5. No matter which browser I use (Chrome, Safari, Edge, …), it freezes. On top of that, no matter what premises or constraints I configure, it consistently ignores them. I even provide it with data it’s supposed to verify, but it fails to check it and just moves on. At this point, I’ve had enough. The constant freezes, crashes, and inability to follow instructions make it not worth the hassle. I’m switching to other models that actually work reliably.

924 MB in the active tab… it’s become nearly unusable at this point. And no, starting a new chat to get back to a speedy response, even with a carryover summary of the og chat, is not a real option/solution. How can we build a really in-depth chat on a specific project/brand/niche/etc. if it just gets bogged down like this?

What’s infuriating is that we are being told these stupid solutions like “clear your browser cache” and “disable extensions”, when in reality the fault lies with the heavy-handed way this app was built. The slowness is simply the conversation getting long, and the rendering they use for the messages is extremely wasteful.

You can prove that to yourself: your fan spins up, Process Explorer shows high CPU% on the browser tab, and the browser even spits out the “Wait or Stop” page-unresponsive dialog.