Refreshing is in fact the only way to survive. And not even a plain refresh: copy the URL from the address bar, kill the browser tab, open a new tab, paste the URL, and wait. Then, when you think the response has probably finished, hit Enter.
The thinking phase is actually reasonably fast to watch. The problem is when the main response arrives: the browser enters an insane rendering and DOM-recasting nightmare that brings everything to a halt. Developer cleverness and framework nonsense (React) grinding everything to a screeching halt.
I have a proposal: use lightweight JavaScript that receives the response and simply appends it to the DOM node by node. No matter how large the conversation grows, there is ZERO need to recast earlier messages. This is all artificial nonsense by the browser app trying to be clever and obscure so that we can't script it ourselves.
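The append-only idea can be sketched in a few lines. This is a minimal illustration, not ChatGPT's actual code: it uses a tiny stand-in object for a DOM container so it runs outside a browser (in a real page the container would come from something like `document.querySelector`). The point it demonstrates is that each incoming chunk becomes a new node and earlier nodes are never recreated, so per-chunk render cost stays constant instead of growing with the conversation.

```javascript
// Tiny stand-in for a DOM container so the sketch runs outside a browser.
function makeContainer() {
  return { children: [], appendChild(node) { this.children.push(node); } };
}

// Append-only rendering: a new node per chunk, existing children untouched.
function appendChunk(container, text) {
  container.appendChild({ kind: 'text', text });
}

const chat = makeContainer();
for (const c of ['Hello', ', ', 'world']) appendChunk(chat, c);

const firstNode = chat.children[0];
appendChunk(chat, '!'); // later chunks never replace earlier nodes

console.log(chat.children.length);           // 4
console.log(chat.children[0] === firstNode); // true: old node identity preserved
```

In a real browser the same pattern is `container.appendChild(document.createTextNode(chunk))`; the contrast is with a framework re-render that rebuilds or diffs the whole message list on every chunk.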
It definitely doesn't seem to be a problem of long contexts, RAM, or cache. It is a fundamental problem: if it were one of the issues I mentioned above, the Android or iPhone app would not work so smoothly, because each query there also loads the context of the conversation.
ChatGPT can help solve this.
The structure of the HTML for chats looks like this. If they added collapsing to the articles in the HTML and didn't fill those components when rendering, rendering wouldn't have these long hangs.
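The collapsing idea can be sketched as a pure function. This is a hedged illustration, not the real chat markup: the article shape and the `keepExpanded` count are made up. It shows the core of the suggestion, namely that only the last few articles stay filled while older ones are reduced to empty placeholders to be filled lazily when scrolled into view. (In an actual browser, CSS `content-visibility: auto` on each `article` achieves a similar effect natively.)

```javascript
// Sketch: keep only the last `keepExpanded` articles filled; older ones
// collapse to placeholders whose bodies would be filled lazily on scroll.
// Article shape ({ id, body }) is illustrative, not ChatGPT's real DOM.
function collapseOldArticles(articles, keepExpanded = 3) {
  const cutoff = articles.length - keepExpanded;
  return articles.map((a, i) => ({
    ...a,
    collapsed: i < cutoff,          // everything older than the cutoff collapses
    body: i < cutoff ? null : a.body, // collapsed articles carry no rendered body
  }));
}

const articles = Array.from({ length: 10 }, (_, i) => ({ id: i, body: `msg ${i}` }));
const view = collapseOldArticles(articles);

console.log(view.filter(a => !a.collapsed).length); // 3 expanded
console.log(view[0].body);                          // null: oldest is a placeholder
console.log(view[9].body);                          // "msg 9": newest stays filled
```

The browser then only has to lay out a handful of filled articles per render, regardless of how long the conversation gets.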
Same issue here, response times on desktop/Windows are excruciatingly slow, this happens on all browsers regardless of any add-ons, number of tabs open, CPU/memory resources, etc. The only solution is to have super short chats - restart the conversation after every few exchanges - which is absolutely not practicable, given the complexity of some topics.
I’m ending my Plus subscription before the end of the month and I’m definitely not paying again for this service until this issue is solved.
Others have adequately described the issue. I don't have much new to add. I understand it's a React web rendering issue, but I can't believe it's been broken for so long. I am only writing to express my dismay, surprise, and disgust at OpenAI's open apathy toward this issue. No other chatbot has this issue. OpenAI, if you are still listening somehow: you are going to die unless you fix this soon. Sorry.
For what it's worth - here is how I see the issue:
When I have a long chat going on with ChatGPT on the web, it slows to a crawl when it's time to show output. "Thought for xx s", then a long hang. The same output is available immediately in the iOS iPhone app. So now I type into ChatGPT on the web (because it's so much easier on the desktop) and look for the output on the iPhone. The issue affects every browser, every OS, and the desktop app too.