I’m experiencing significant performance issues with the web browser version of ChatGPT Plus (the paid tier).
Slow Loading: The initial page load takes an excessively long time.
Delayed Input: There’s a noticeable delay between typing my message and ChatGPT acknowledging it.
Slow Response Times: ChatGPT takes an unusually long time to generate responses, often 20-30 seconds or more.
Unresponsive Interface: The chat window frequently freezes or becomes unresponsive during these delays.
Slow Chat Switching: Navigating between different conversations within the web browser is also very slow.
These issues occur regardless of the browser I use (e.g., Internet Explorer, Google Chrome), which suggests the problem lies with the web app itself rather than with any specific browser.
Interestingly, I don’t have these performance issues when using the ChatGPT phone app. The app provides a much smoother and faster user experience.
I’m seeking assistance from the community to see if others are experiencing similar problems and to gather potential solutions or workarounds.
I’ve been struggling with insane lag times for about a month. I use the AI to help me write student IEPs and have worked with it on regulations and formatting specific to the software used in my school district. I think I’ve hit a point where my thread is simply too long for the AI to handle. I’m wondering if I need to switch to a higher premium tier (I’m currently on the $20/month Plus plan?) or start a new thread for each student and then import a key-points brief for the bot.
Also, I’m way beyond the 5 seconds of lag you refer to. I’ve tried clearing the cache and memory on both my computer and my ChatGPT account. Maybe one of the other GPT models would be a better fit for my use profile?
I found out that for me it was the long chat thread. Basically I was using the same chat thread for most of my questions that it got too long and would take time to refresh. Once I opened a new chat the issue resolved.
So, ChatGPT Plus or not, at some point you have to break away into a new chat.
I’m also experiencing this issue, not just in the web browser but also in the standalone Windows app. It becomes increasingly slow, unresponsive, and even freezes as the conversation history gets longer. This makes it difficult to continue using the same chat without having to start fresh, which isn’t always ideal when working on long-term projects or discussions.
It seems like the core problem is that ChatGPT tries to load the entire conversation all at once, consuming too much memory. Many modern communication apps solve this by implementing “lazy loading”, where only the most recent messages are loaded first, and older messages load dynamically as you scroll up.
Has anyone found a workaround, or is there any indication from OpenAI that they plan to address this? It would be great if the devs could implement a similar system to improve performance.
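To illustrate what a lazy-loading fix might look like, here is a minimal list-virtualization sketch in plain JavaScript (all names are hypothetical, and this is not OpenAI's actual code): only the messages inside the scrolled viewport, plus a small buffer, would ever be rendered, so the DOM stays small no matter how long the conversation gets.

```javascript
// Minimal list-virtualization sketch (hypothetical names; fixed row height
// for simplicity): compute which message indices fall inside the viewport,
// plus a small buffer, and render only those.
function visibleRange(scrollTop, viewportHeight, rowHeight, totalMessages, buffer = 5) {
  const first = Math.max(0, Math.floor(scrollTop / rowHeight) - buffer);
  const last = Math.min(
    totalMessages - 1,
    Math.ceil((scrollTop + viewportHeight) / rowHeight) + buffer
  );
  return { first, last };
}

// A 10,000-message conversation, 120px rows, 800px viewport, scrolled deep in:
// only 18 messages (indices 4995..5012) need to exist in the DOM at once.
const range = visibleRange(600000, 800, 120, 10000);
```

A production version would use measured row heights and react to scroll events (or `IntersectionObserver`), but the core idea is the same: the rendering work depends on the viewport size, not on the conversation length.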
I agree. When the conversation reaches a certain depth, the browser chat interface slows down to the point that it is unusable. If you can afford to lose the current context, the simplest solution is to start a new conversation. If context is essential, two “workarounds”:
Wait for the first answer token to appear (confirming your query was received), then press F5 to reload the page; this is often faster than waiting for the remaining tokens to stream in. It is certainly not an OpenAI-endorsed solution, since I suspect it creates even more load on their servers.
Ask ChatGPT to generate a summary or transcript of the current conversation that captures the most important elements of the current context. Download and review that file to ensure it contains the critical bits of information you don’t want to lose, then start a new chat instance and, as your first prompt, ask it to load that context file. In my experience, many details are lost that way, but the broad strokes survive, and you can resume the conversation with some context.
The best solution would remain a fix by OpenAI, of course!
Potentially related to [tt]issue-with-conversation-duration-limit[/tt]
I feel it’s much worse today as well; my guess is they implemented something else that uses even more browser memory, making this problem worse.
I thought it was just me having this issue. It’s been driving me nuts. I tried three browsers on my PC, and it was the same even on my laptop. I don’t really use the mobile app, as I need the info I get on the PC.
Greetings. I have done it all: cleared the cache and cookies; summarized the conversation (because I need the context to carry over) and moved to a new one; used different browsers and different devices. The response time is appalling. On my phone? Absolutely fine. Otherwise, it can take over a minute, with many “the page is unresponsive” errors, until it finally replies to anything. Is anything being done to sort out this problem? And no, it is not a problem of conversation length, because it happens even after I open a new conversation and ask two questions; it starts stalling then as well. Has anyone found a solution for this?
The lag times are now horrifying after only a few questions. As far as I can see, it’s the JavaScript code re-rendering the whole page for every new token that streams in. Someone at OpenAI needs to grab the fastest machine in town now and fix it.
Bought a $9 stress ball off Amazon to use during the long ChatGPT wait times… it didn’t help. It’s unbearable: 1-2 minute wait times, only to get half the response, then the rest 40 seconds later. Sometimes I get everything, but it just disappears. And if the app window is minimized or goes into the background while ChatGPT is thinking, when I come back to check whether it’s done, it almost always errors out with the red warning and I have to click retry. This happens 90% of the time the window gets minimized or pushed behind other windows mid-response.
ChatGPT PC App
Win10 Pro / i7-3930k / 32GB RAM
I am organized, keep everything in Projects, only 4-5 projects. Restart the app every hour or so.
The ChatGPT web app becomes unresponsive in long conversations. If you open a very long chat, it will even reliably crash the browser tab. It’s driving me insane.
As a web developer, I can tell you the problem is simple: they re-render the entire conversation (from the start, including what you cannot see) every time new tokens come in. This is an exceptionally rookie mistake, an amateur oversight that is hard to explain in any production chat app.
It’s even more astonishing that they haven’t noticed or fixed this for such an incredibly long time. It seems like something that could be fixed in an afternoon or two, so it’s a little perplexing.
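For comparison, here is a pure-logic sketch (hypothetical names, not OpenAI’s code) of what an incremental streaming renderer does instead: each incoming token is appended to the last message only, so the cost per token is O(1) regardless of conversation length. A real UI would do the equivalent by appending to a single DOM text node rather than rebuilding the transcript.

```javascript
// Incremental streaming sketch (hypothetical names): tokens are appended
// to the message currently being streamed; earlier messages are never touched.
function createTranscript() {
  const messages = [];
  return {
    messages,
    startMessage() {
      messages.push('');
    },
    appendToken(token) {
      // O(1) per token -- only the last message changes, no matter how
      // many thousands of messages precede it.
      messages[messages.length - 1] += token;
    },
  };
}

const transcript = createTranscript();
transcript.startMessage();
for (const tok of ['Hel', 'lo', ' world']) transcript.appendToken(tok);
// transcript.messages[0] === 'Hello world'
```

Re-rendering everything per token, by contrast, is O(n) work per token over the whole conversation, which is exactly the degradation people in this thread are describing as chats grow.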
The only good reason I can imagine for them not fixing this issue (and others) is that they’re working hard on an all-new version of the web app that will wholly replace the current one soon enough. In that case, any effort spent on the old version we’re all stuck with would be a waste of their development capacity, and the new version is probably overrunning its deadline; we weren’t supposed to be using this buggy version for such an extended period. Just speculating, and hoping!