Since the issue has persisted for quite some time and I find the new model/personality updates horrible, I’ve started exploring and discovered Claude AI. I’m currently testing it, but so far there are many aspects that I like significantly more than with ChatGPT. It takes a little more work to get the same results in certain cases, but I like the reasoning more, it’s a bit more creative and I find it to be closer to my preferences.
I am a ChatGPT Business subscriber and I’ve used it extensively over the last year, but I really don’t like where it’s headed. I don’t want it to agree with me and keep the conversation going; I want it to help me identify flaws and establish creative strategies/approaches.
I also hate that ChatGPT tends to suggest third-party products or give empty answers instead of suggesting technical approaches. I’ve tested it with several problems and it falls into a loop of “buy X” or “use machine learning” until I explicitly ask about techniques and strategies for the specific situation. Claude, on the other hand, mentions the generics but goes into the deep dive, too.
Hi everyone
HERE IS A SOLUTION/WORKAROUND THAT WORKS
- (Optional) If you can, and it’s not already the case, create a project with a system prompt that gives ChatGPT a lot of context so it can continue your discussion with the same tone, expertise, etc.
- (Optional) If you can, and it’s not already the case, move your long/blocked/laggy discussion into this project.
- While in the long/blocked/laggy conversation, press CTRL+A (or CMD+A) to select the whole content of the page, then copy it.
- Open Word or Google Docs, whatever you prefer.
- Use CTRL+SHIFT+V (and NOT CTRL+V) to paste the text without the formatting.
=> You should obtain a very long document with the whole text dump of your blocked/laggy conversation.
- Remove the bits of text at the beginning and end of the document that are not part of the discussion (you will see other discussions’ names, etc.).
- Save your document in a folder (cloud-linked if possible) and name it something like mysuperdiscussion1.docx.
- In the ChatGPT project (if possible, otherwise just among your conversations), create a new, blank discussion.
- Upload your .docx into the new discussion and tell ChatGPT something like “Based on this document, please continue the discussion”.
IMPORTANT: Remember, over time, to keep all the .docx files in one folder, named mysuperdiscussion1.docx, mysuperdiscussion2.docx, etc.
You will be able to give them all back to ChatGPT each time you start a new discussion to replace a laggy/long one. They will act as the “memory” of your conversation.
And voilà!
Oh man thank you sooooo much I was looking for a fix for this for so long man, again thank u soooo much bro
Same issue here, I’ve been continuing long conversations in new chats for that reason. I think the cause is just the sheer amount of HTML the browser has to render on long conversations, with all the pretty formatting and such. If the conversation was not loaded fully into the browser but instead used a “load more” feature, I doubt this would happen.
Same here. I have a chat for discussing DSA and CS stuff; it’s pretty long as I’ve been using it for the last 4 semesters. It’s a breeze on the iOS app, but on my laptop it takes 10-15 seconds to load, and for a new response it doesn’t stream the reply, it just freezes the whole page; when I check back 1-2 minutes later the complete answer is there.
I’ve been experiencing memory retention problems for at least the past week, since around 10th April, and unfortunately they have become progressively worse. I’ve reached out to support 6 times already this past week (via email) but keep getting fluff replies and no help, escalation route or assistance. I’m rather cross to be honest, as I’m a paying Plus user and they won’t even help me try to resolve it.
To improve your chances of getting help, if this is a problem others are seeing, post it as a new topic with details so that others, including the OpenAI staff, can recreate it.
If others on the forum see and/or can recreate the problem then they will most likely chime in with a me too reply in your new topic.
If the topic then explodes with others noting the same it will get noticed and passed up the chain.
As it currently stands you are just adding to an existing post and often these are not noticed.
HTH
Is this a Chrome-related issue? Which browser best supports long chats? This is really frustrating if you have a long chat and you cannot really get out of the chat. Even writing anything in the chat box is extremely slow and does not respond most of the time.
It keeps saying “load failed” or giving incorrect answers.
This is still an issue. I believe it’s because of the number of nodes in the DOM, which gets quite high for longer chats.
Instead of keeping the whole conversation rendered, the frontend should lazy-load the chat history. I hope a frontend developer sees this.
This only affects the web app.
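To make the lazy-loading idea concrete, here is a rough sketch of a “load more” pager for chat history. All names and the data shape are hypothetical, not ChatGPT’s actual frontend code; it just shows the core bookkeeping: render only the newest page first and fetch older turns on demand.

```javascript
// Sketch of "load more" paging over a chat history (hypothetical names).
// Return the newest `pageSize` turns before `cursor` (an index into the
// full history; null means "start from the newest"), plus the cursor to
// use when the user asks for the next, older page.
function loadOlderPage(history, cursor, pageSize) {
  const end = cursor === null ? history.length : cursor;
  const start = Math.max(0, end - pageSize);
  return {
    turns: history.slice(start, end),
    nextCursor: start > 0 ? start : null, // null => nothing older to load
  };
}

// Example: a 7-turn conversation paged 3 at a time.
const history = ['t1', 't2', 't3', 't4', 't5', 't6', 't7'];
let page = loadOlderPage(history, null, 3);
console.log(page.turns); // ['t5', 't6', 't7'] — only the newest page
page = loadOlderPage(history, page.nextCursor, 3);
console.log(page.turns); // ['t2', 't3', 't4'] — loaded on demand
```

The point is that the DOM only ever holds the pages the user has actually asked for, instead of every turn in the conversation.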
I’ve experienced exactly the same issue.
A few workarounds I’ve found so far:
- For smaller discussions, I created a small Chrome plugin that removes all the old nodes. It helps to some extent, but not for long.
- For longer discussions, approach #1 didn’t work: the JS framework still renders all the messages, which is very resource-consuming. I ended up creating a thin JS proxy using Node.js that returns only the last N items (say, 30) to the page, and the chat stays performant.
I hope the ChatGPT team eventually pushes their devs to implement lazy loading, which would sort the problem once and for all.
I don’t know why they don’t fix this issue!!!
This is a really big problem for heavy users. It honestly defeats the purpose of paying for premium! This is not good.
I have a simple solution now, which is to delete unnecessary history from the DOM.
Just copy and run this script in your browser console (change the number to however many turns you want to keep):
function keepLatestConversations(countToKeep) {
  const allConversations = Array.from(
    document.querySelectorAll('[data-testid^="conversation-turn-"]')
  );
  // Extract and sort by testid number in descending order
  const sorted = allConversations
    .map(el => ({
      el,
      id: parseInt(el.getAttribute('data-testid').split('-').pop(), 10)
    }))
    .sort((a, b) => b.id - a.id);
  // Keep only the first countToKeep, remove the rest
  const toDelete = sorted.slice(countToKeep);
  toDelete.forEach(({ el }) => el.remove());
}

// Keep only the latest 10 conversations
keepLatestConversations(10);
I tried doing it manually and with your script… the elements are deleted, but it doesn’t solve the problem. Maybe it’s something else in the DOM?
It’s working well for me. I noticed that when I copy the text directly, the quote style changes and causes a syntax error. If that happens to you, try fixing the quotes.
This bug is just something… The Windows app doesn’t allocate enough RAM to process the chat, and there is no lazy loading. In the web version, even if you clear previous messages the speed doesn’t increase as it’s all already loaded into memory (which means you have to refresh the page, making clearing useless). I could fix this in half an hour to an hour, but OpenAI releases a bunch of low-value application updates that don’t solve this problem. Please finally add “lazy” loading of chat messages, it’s so simple after all, it’s not even comparable to the tasks you’re doing for AI development. Thank you!
This whole issue is pervasive and it gets worse quicker the more I use the site. I have a Plus account and threads become unusable after only a day or two.
Completely unacceptable, and it’s going to force me to end my subscription.
The issue is caused by the web developers not implementing virtualized scrolling. The browser app loads every single chat entry and response into the web page, even when it’s not in view. This is the cause.
It’s pretty strange that they don’t add such a basic feature, since any chat box would immediately require such an implementation.
I would guess that the mobile version has virtualized scroll.
Virtualized scroll will just load elements into the DOM when the element SHOULD be in view or close to it. Something way way up the scrollable page won’t even exist until you’re close to scrolling it in.
Someone else said to “just add lazy loading of the chat”, but that alone won’t fix the issue: if you scroll up, it will load everything and become laggy again.
Virtualized scrolling is the fix
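As a rough illustration of the windowing math behind virtualized scrolling (a fixed row height is assumed for simplicity; real chat turns vary in height, which is what makes a production implementation harder, and none of this is ChatGPT’s actual code):

```javascript
// Virtualized scrolling sketch: only the rows that intersect the viewport
// (plus a small overscan buffer) need to exist in the DOM at any moment.
function visibleRange(scrollTop, viewportHeight, rowHeight, totalRows, overscan = 3) {
  const first = Math.floor(scrollTop / rowHeight);
  const last = Math.ceil((scrollTop + viewportHeight) / rowHeight) - 1;
  return {
    start: Math.max(0, first - overscan),
    end: Math.min(totalRows - 1, last + overscan),
  };
}

// 10,000 turns, 120px each, 900px viewport, scrolled to 600,000px:
// only ~8 visible rows plus overscan need to be mounted.
const range = visibleRange(600000, 900, 120, 10000);
console.log(range); // { start: 4997, end: 5010 }
```

Rows outside that range are unmounted entirely, so DOM size stays constant no matter how long the conversation gets — which is exactly why scrolling up stays fast, unlike with plain lazy loading.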

