After using the GPT-4 model for a few months, I've observed that ChatGPT uses the client machine's resources to process the chats in a session. It doesn't use much RAM when a chat session is new and has few messages, but once a session has a long conversation history, RAM usage keeps climbing. Even when you aren't actively conversing with the model, RAM usage stays constantly high, and simply opening a session with a long history makes usage spike instantly and the browser tab starts to hang.
Based on this, I think that continuing a conversation in the same session will keep increasing RAM usage. As you can see in the screenshot, RAM usage is quite high even while the session is idle and I'm not prompting the model.
In my case it peaked at 2.2 GB of RAM while the model was processing and generating output, and my laptop has only 8 GB of RAM.
I also tried continuing the conversation in a new chat session by giving the model a shareable link to the old session as a reference to the past conversation, but honestly, that does not work well.
So I think the developers should do the processing of the conversation on their servers instead of the client machine, and implement this as soon as possible, because RAM usage seems to keep growing as a GPT-4 session accumulates more messages.
Let me know if anyone else is facing this issue. If someone has found a solution to this problem, please share it.
This is happening to me as well. Any solution for this?
My browser is freezing constantly; there's no way to use it.
The browser only freezes when using ChatGPT.
Same issue: the longer the conversation, the more often the tab crashes or hangs. I'm developing some code and can't just open a new conversation, since that would lose all the context and I'd have to explain everything all over again, making the new conversation longer and longer until I end up at the same point. We need a fix for this, perhaps by limiting how far back it processes the conversation; by now the context from the very beginning of the conversation matters far less than the last quarter or third.
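To illustrate the "only keep the last quarter or third" idea, here is a minimal Python sketch of sliding-window truncation. This is purely hypothetical: the message format and the `keep_fraction` value are assumptions for illustration, not how ChatGPT's client actually works.

```python
# Hypothetical sketch: keep only the most recent portion of a conversation
# before processing it. keep_fraction=0.25 matches the "last quarter"
# suggestion above; the value is an assumption, not a real setting.

def truncate_history(messages, keep_fraction=0.25):
    """Return only the most recent fraction of the conversation."""
    keep = max(1, int(len(messages) * keep_fraction))  # always keep >= 1 message
    return messages[-keep:]

history = [f"message {i}" for i in range(100)]
recent = truncate_history(history, keep_fraction=0.25)
print(len(recent))   # 25 of the original 100 messages remain
print(recent[0])     # "message 75"
```

Something like this would bound the amount of history the client has to render and re-process on every new prompt, at the cost of losing older context.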
You need to stop relying on the full conversation history that ChatGPT gets as context. The longer the conversation, the larger the context, and remember that the context window also has a limit. So if you are developing something, break the development down into small parts and summarize content as much as you can. If you have code or a class that does something, you can simply describe what it does in a few words. If it is a complex class or file that still fits within your ChatGPT plan's input limit, paste the file's code and ask ChatGPT to analyze it, then cover the next piece in the following message; little by little you will be able to build your code in parts. ChatGPT is not efficient at remembering things and it has limitations, so try to keep things separate and short.
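The "break it into small parts" approach above can be sketched as a simple splitter that cuts a large source file into chunks that each fit under a size budget, so each chunk can be pasted into its own prompt. This is a rough sketch: the budget value is an assumption, and real model limits are measured in tokens rather than characters.

```python
# Sketch: split a large source file on line boundaries so each chunk
# stays under a character budget. The budget of 500 characters is an
# arbitrary assumption for this example; real limits are token-based.

def split_into_chunks(source: str, budget: int = 4000) -> list[str]:
    """Split text on line boundaries so each chunk stays under the budget."""
    chunks, current, size = [], [], 0
    for line in source.splitlines(keepends=True):
        # Flush the current chunk if adding this line would exceed the budget.
        if size + len(line) > budget and current:
            chunks.append("".join(current))
            current, size = [], 0
        current.append(line)
        size += len(line)
    if current:
        chunks.append("".join(current))
    return chunks

code = "\n".join(f"line {i}" for i in range(1000))
parts = split_into_chunks(code, budget=500)
print(len(parts), max(len(p) for p in parts))
```

Joining the chunks back together reproduces the original text, so nothing is lost; you just feed the pieces to the model one message at a time. (A single line longer than the budget would still exceed it, so in practice you'd summarize such lines instead.)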
Same here. Every message generates so much data in RAM… this is ridiculous. It depends on the message length, but a single response can take up to 800 MB of RAM. I noticed it when my ChatGPT tab froze and Firefox warned me that the tab was slowing down my PC. I looked in the task manager and the tab was using over 5 GB of RAM. I have to refresh the site every few responses to release some memory, but it shouldn't work this way…
Same here. I've been doing some Python coding and noticed over the weekend that it would crash my browser or tab. Today it is almost unusable. I agree, and I hope they get it fixed.
The issue doesn’t lie within the current conversation you are having. Instead, the problem is caused by previous chats (displayed in the left sidebar) that ChatGPT has not optimized to minimize memory usage. As a result, the more chats you have stored, the more memory any new conversation will start off with.
For someone like me who uses ChatGPT daily for work and has accumulated hundreds of chats, it has become nearly impossible to continue a conversation without encountering high memory usage. Google Chrome’s maximum tab memory is around 2.2GB, and reaching this limit causes significant slowdowns and potential crashes.
Currently, even a brand new, empty chat session takes up around 350MB of tab memory. This indicates that the memory footprint of previous chats is substantial and affects the performance of new conversations.
OpenAI needs to optimize the memory usage of previous, unopened chats to prevent them from consuming excessive resources during active sessions. This optimization is crucial for users who rely on ChatGPT for extended and frequent use.
As a temporary workaround, when you notice the chat slowing down and sense an impending crash, you can open a new tab and continue the same conversation there. This approach resets the tab memory without losing the conversation. However, this is only a temporary fix, as the tab memory will increase rapidly with each new prompt.
In summary, optimizing the memory usage of previous chats is essential to improve the overall performance and user experience. I hope this information helps both users understand the issue and developers at OpenAI identify and resolve the problem efficiently.
My System Specs:
-Windows 11 Pro
-Nvidia 4080 GPU
-Intel i9
-64GB RAM
I think you’re on to something here. I archived most of my older chats and it did not seem to help the situation. I have deleted old memories and even turned off memory. No noticeable difference.
Edit: I tried this using a free (non-subscription) account; the tab grew to over 1.1 GB with 2 conversations and a small (~100-line) piece of Python code as I asked three follow-up questions.
Hi there,
I am having the same issue,
The browser tab started using over 4 GB of RAM. I remember having this issue months ago, and it was fixed after a couple of days. Now the memory leak is back since 27.05.2024 and is still not fixed (as of 30.05.2024).
Honestly, it surprised me as well. I only started experiencing this issue yesterday. The only difference in my workflow is that I’m primarily using GPT-4o.
I’m hopeful that this is a simple oversight by OpenAI’s developers and that they will patch the issue soon.
You bring up an interesting point that could indeed be a contributing factor. I have observed that memory usage spikes significantly when ChatGPT utilizes its Analysis feature or writes Python code.
For instance, I was working on a system architecture document yesterday, and the memory issue became evident as soon as ChatGPT began coding an example implementation in Python. This suggests that these specific features may have become more resource-intensive, exacerbating the memory consumption problem.
For anyone using ChatGPT to help with code or data analysis: be mindful of potential memory spikes during intensive tasks like these.