I've been using the GPT-4 model in ChatGPT for a few months, and I've noticed that ChatGPT seems to use the client machine's resources to process the chats in a session. It doesn't use much RAM when a chat session is new with only a few messages, but once a session has a long conversation history, RAM usage keeps climbing. It stays high even when I'm not actively conversing with the model, and just opening a session with a long history makes RAM usage spike instantly and the browser tab starts to hang.
From what I've seen, continuing a conversation in the same session will keep increasing RAM usage over time. As you can see in the screenshot, RAM usage stays high even while the tab is just sitting open, without me prompting the model at all.
In my case it peaked at about 2.2 GB of RAM while the model was processing and generating output, and my laptop only has 8 GB.
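If anyone wants to reproduce this measurement instead of eyeballing the task manager, here's a rough sketch for reading the tab's JS heap. Note the assumptions: `performance.memory` is a non-standard, Chromium-only API (it won't exist in Firefox or Safari), and the Node.js fallback plus the `bigHistory` allocation are just illustrative stand-ins for a growing chat history, not anything ChatGPT itself does.

```javascript
// Rough sketch: read the JS heap used by the current environment, in MB.
// In a Chromium browser tab, performance.memory reports the tab's JS heap;
// in Node.js we fall back to process.memoryUsage() so the snippet runs anywhere.
function heapUsedMB() {
  if (typeof performance !== "undefined" && performance.memory) {
    // Chromium-only, non-standard API
    return performance.memory.usedJSHeapSize / (1024 * 1024);
  }
  // Node.js fallback
  return process.memoryUsage().heapUsed / (1024 * 1024);
}

// Take a reading, allocate a large array (a hypothetical stand-in for a
// long chat history held in memory), then read again to watch the heap grow.
const before = heapUsedMB();
const bigHistory = new Array(1_000_000).fill("message"); // illustrative payload
const after = heapUsedMB();
console.log(`before: ${before.toFixed(1)} MB, after: ${after.toFixed(1)} MB`);
```

Pasting something like this into the DevTools console of the ChatGPT tab, before and after scrolling a long session, would give actual numbers instead of a rough reading from the system task manager.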
I also tried continuing the conversation in a fresh chat session by giving the model a shareable link to the older session as a reference for the past conversation, but honestly that doesn't work well.
So I think the developers should do this conversation processing on their servers instead of on the client machine, and implement that as soon as possible, because RAM usage seems to keep growing with every additional message in a GPT-4 session.
Is anyone else facing this issue? If you've found a solution, please share.