Hey, just wanted to flag a serious UX issue I’ve been running into, one that hits hardest if you’re like me and use ChatGPT for long sessions (think: research, code, writing, idea dumping, etc.).
After a while, the browser starts lagging hard. Like, I’ll type a sentence, and it just hangs. Feels like I’m typing in a frozen Google Doc from 2008.
What’s causing it? Not the model — the UI.
Basically, ChatGPT keeps the entire conversation mounted in the DOM. That means tens or even hundreds of thousands of tokens’ worth of messages are all rendered at once, including everything way off-screen, so every keystroke and re-render has to fight that giant tree. It becomes this endless scroll monster.
What’s needed:
Virtual scroll. Same thing Discord, Slack, Gmail, etc. already do. Only render what’s visible, lazy-load the rest as you scroll. Super standard stuff (rough sketch of the idea below the list).
This would:
Fix the typing/input lag
Stop the UI from turning into molasses during long chats
Keep memory usage sane
Let us keep working in one thread without having to refresh or start a new one
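
For anyone curious what “only render what’s visible” actually means in practice, here’s a rough sketch in plain TypeScript. To be clear, this is just my illustration, not ChatGPT’s actual frontend code: names like `renderWindow` and `ROW_HEIGHT` are made up, and it assumes roughly uniform message heights, which real chat UIs don’t have (libraries like react-window or TanStack Virtual handle the per-item measuring for you).

```ts
// Minimal windowing sketch (framework-agnostic). Hypothetical names throughout.
interface Message { id: string; html: string }

const ROW_HEIGHT = 120; // assumed average rendered height of one message, in px
const OVERSCAN = 5;     // extra rows above/below the viewport to avoid flicker

function renderWindow(container: HTMLElement, messages: Message[]): void {
  // Work out which slice of the conversation is actually on screen.
  const first = Math.max(0, Math.floor(container.scrollTop / ROW_HEIGHT) - OVERSCAN);
  const last = Math.min(
    messages.length,
    Math.ceil((container.scrollTop + container.clientHeight) / ROW_HEIGHT) + OVERSCAN,
  );

  // A tall spacer keeps the scrollbar proportional to the full conversation,
  // while only the visible slice actually lands in the DOM.
  container.innerHTML = `
    <div style="height:${messages.length * ROW_HEIGHT}px; position:relative">
      ${messages
        .slice(first, last)
        .map(
          (m, i) =>
            `<div style="position:absolute; top:${(first + i) * ROW_HEIGHT}px">${m.html}</div>`,
        )
        .join("")}
    </div>`;
}

// Re-render the window on scroll; a real implementation would throttle this
// and measure per-message heights instead of assuming ROW_HEIGHT.
// container.addEventListener("scroll", () => renderWindow(container, messages));
```

The point is just that the DOM only ever holds a screenful or two of messages, no matter how long the conversation gets.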
I’m routinely hitting 100k–300k-token chats and seeing serious slowdown, especially on mid-tier laptops or with other tabs open. It gets unusable at times.
TL;DR:
- The longer the chat, the slower the UI
- ChatGPT’s frontend tries to render everything
- Needs virtual scroll like literally every other modern app
- Would make a massive difference for anyone using GPT for more than quick Q&A
Anyone else hitting this? Would love to see the team pick it up — feels like low-hanging fruit with huge upside.