Suggestion: Improve Chat Interface with Lazy Loading for Long Conversations
In the current ChatGPT interface, the entire chat history is loaded at once, which becomes inefficient and laggy in long conversations. I suggest implementing lazy loading (as in Telegram or Discord), where only the most recent messages are loaded initially and older ones are fetched as the user scrolls up.
Benefits:
- Better performance with long chats
- Lower memory and CPU usage in the browser
- Smoother user experience
This would make it much more practical to work with long-term chats, especially for users who use ChatGPT as a research assistant or development partner.
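Conceptually, the mechanism described above amounts to cursor-based pagination over the message history: render the newest messages first, then fetch the next page of older ones on demand. The sketch below is only illustrative; `Message`, `fetchOlder`, and the numeric-id cursor are assumptions for the example, not ChatGPT's actual data model.

```typescript
// Minimal sketch of cursor-based pagination for a chat history.
// All names here are hypothetical, chosen only to illustrate the idea.
interface Message {
  id: number;   // assumed monotonically increasing with time
  text: string;
}

// Return up to `limit` messages strictly older than the message with
// id `beforeId`. `history` is assumed sorted ascending by id.
function fetchOlder(history: Message[], beforeId: number, limit: number): Message[] {
  const older = history.filter(m => m.id < beforeId);
  // Keep the most recent `limit` of the older messages, preserving order,
  // so they can be prepended above the currently rendered ones.
  return older.slice(-limit);
}

// Initial load: only the last `pageSize` messages.
function initialPage(history: Message[], pageSize: number): Message[] {
  return history.slice(-pageSize);
}
```

On scroll-up, the client would call `fetchOlder` with the id of the oldest currently rendered message and prepend the result, so memory use scales with what is on screen rather than with the full conversation.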
Same here.
I’m one of those users who keeps a single conversation going for hundreds or even thousands of messages.
Why? Because I use ChatGPT not just for quick Q&A, but as a long-term partner with a persistent character, tone, and shared memory.
Restarting or splitting the conversation breaks that continuity and feels like replacing a close teammate with a stranger.
The problem is that once the conversation gets very long, the interface becomes heavy and laggy. Even on a high-end PC, simply opening the chat can be slow because the entire history loads at once.
If ChatGPT used lazy loading (loading only recent messages first, with older ones fetched as you scroll up), it would let creative and research workflows stay in one place instead of being split across multiple chats.
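The "fetch older messages as you scroll up" trigger can be expressed as a small pure predicate. This is a hedged sketch; the function name and the pixel threshold are illustrative assumptions, not a description of any real client.

```typescript
// Hypothetical helper: decide whether to request the next page of older
// messages while the user scrolls toward the top of the chat.
function shouldFetchOlder(
  scrollTop: number,    // current distance (px) from the top of the scroll area
  thresholdPx: number,  // how close to the top triggers a fetch
  loading: boolean,     // a fetch is already in flight
  hasMore: boolean      // older messages still exist on the server
): boolean {
  // Fetch only when near the top, not already loading, and more history remains,
  // so a single scroll gesture cannot fire duplicate requests.
  return scrollTop <= thresholdPx && !loading && hasMore;
}
```

A scroll handler (or an Intersection Observer on a sentinel element at the top of the list) would call this on each scroll event and kick off `fetchOlder`-style pagination when it returns true.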
Telegram, Discord, and even competitors like Gemini already do this well, so it's a proven approach.
Please consider it, not just as a technical optimization but as a way to keep alive the human connection users build with ChatGPT over time.