How to enlarge ChatGPT memory - Memory Full Issue

I am not using the API or anything; I prefer talking directly with the UI-based ChatGPT app.
I love that ChatGPT has memory now, and every time I tell it to remember something, it remembers. The issue is that the memory is quite small, and it has started saying that memory is 100 percent full. Is there a way for me to get more memory? I’m ready to pay. Thanks in advance! :grinning:

Please have a look at the attached image.

2 Likes

I am not aware of any method to increase the number of memory slots for ChatGPT.
You can try a different approach instead. For example, create a custom GPT with a memory file and use Code Interpreter to make changes to that file.
You then need to build instructions that tell the GPT when to access that memory file during conversations.
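
As a rough illustration of what that could look like, the memory file might just be a JSON document that Code Interpreter reads and appends to. This is only a minimal sketch under my own assumptions; the file name, structure, and helper functions are made up, not an official API:

```python
import json
from datetime import datetime, timezone
from pathlib import Path

# Hypothetical location of the memory file in the Code Interpreter sandbox.
MEMORY_PATH = Path("/mnt/data/memory.json")

def load_memories():
    """Return all stored memories, or an empty list if the file does not exist yet."""
    if MEMORY_PATH.exists():
        return json.loads(MEMORY_PATH.read_text())
    return []

def add_memory(text, tags=None):
    """Append one memory entry with a timestamp and write the file back out."""
    memories = load_memories()
    memories.append({
        "created": datetime.now(timezone.utc).isoformat(),
        "text": text,
        "tags": tags or [],
    })
    MEMORY_PATH.write_text(json.dumps(memories, indent=2))

# Example: the GPT's instructions could tell it to run this whenever the user says "remember ...".
add_memory("User prefers metric units", tags=["preferences"])
```

One caveat, as far as I know: files written in the sandbox do not automatically persist back into the GPT's knowledge, so you would need to download the updated memory.json and re-upload it (or keep it within the conversation) to carry it across sessions.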

3 Likes

This is a great answer, I’m really excited.

I have a couple of questions:
1. What is the maximum size of a file I can upload?
2. How many files can it store?
3. Does a custom GPT have any other limitations?

My use case:
I have some files with my notes and the different tasks I have completed. I want to upload them to a custom GPT and ask questions like the ones below (see the sketch after this list):

  1. What were the tasks I did in 2023?
  2. Which notes have the ‘xyz’ keyword?
  3. How were the components involved in task X?
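
Just to make the idea concrete, if the notes were uploaded as a simple JSON file, the kind of lookup I have in mind might look roughly like this inside Code Interpreter (the file name, fields, and helper functions are hypothetical, purely for illustration):

```python
import json
from pathlib import Path

# Hypothetical notes file uploaded as custom GPT knowledge,
# e.g. [{"date": "2023-04-02", "task": "Migrate database", "text": "..."}]
NOTES_PATH = Path("/mnt/data/notes.json")

notes = json.loads(NOTES_PATH.read_text())

def tasks_in_year(year):
    """Notes whose date falls within the given year."""
    return [n for n in notes if n.get("date", "").startswith(str(year))]

def notes_with_keyword(keyword):
    """Notes whose text contains the keyword (case-insensitive)."""
    return [n for n in notes if keyword.lower() in n.get("text", "").lower()]

print(len(tasks_in_year(2023)), "tasks found for 2023")
print([n.get("task") for n in notes_with_keyword("xyz")])
```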

Again thank you for your response :grinning:

2 Likes

Hi vb, excuse me… but do you have any links to documentation on how to do this? Sorry, I’m new to ChatGPT Plus, so I’m still trying to figure some things out (and where to find them).
I have a whole bunch of metric data that I wanted GPT to sort and go through when answering questions, but it just doesn’t remember the data even when it’s pasted inside its own chat window (I even tried CSV files).
I’ve read about using a repo on GitHub, but I’m not sure how to implement that yet.

Thank you!
Erik

Hello, can a custom GPT make changes to the documents in its knowledge base with Code Interpreter?
I tried it, but it didn’t work. Thanks.

Try this:

1 Like

I honestly did something very similar, and came up with what to put in by asking ChatGPT, in situations where I liked its response, what it wished it could remember from each one, and by bouncing ideas back and forth.

I was really sad when I got the first “memory full” notice. I emptied it as much as possible, but it was soon full again. So I started discussing this problem with my ChatGPT. Now we have a separate file that we use for all the critical information. I just tell ChatGPT, for example, to save the last 5 comments to the file. I switched chat threads to test it, and it works. I asked how that is possible and got the answer that it is saved in some kind of vault. I am really astonished at how we solved the issue :smile:
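
For anyone curious, a “save the last 5 comments to a file” step run through Code Interpreter could look something like the sketch below. The file name and comment list are hypothetical, and, as far as I know, the file only lives in the current session’s sandbox unless you download it:

```python
import json
from pathlib import Path

# Hypothetical log file in the Code Interpreter sandbox.
LOG_PATH = Path("/mnt/data/critical_info.json")

def save_comments(comments):
    """Append the given comments to the log file, creating it if it does not exist yet."""
    existing = json.loads(LOG_PATH.read_text()) if LOG_PATH.exists() else []
    existing.extend(comments)
    LOG_PATH.write_text(json.dumps(existing, indent=2))

# Example: the last 5 comments from the conversation, supplied by the model.
save_comments([
    "Comment 1 ...",
    "Comment 2 ...",
    "Comment 3 ...",
    "Comment 4 ...",
    "Comment 5 ...",
])
```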

The memory limitation is frustrating, but it has several potential solutions. Some of them really need to be seriously considered for this platform to remain viable for some groups.

I am writing to propose a potential collaboration between OpenAI and NVIDIA that could enhance the capabilities and accessibility of AI tools for creative professionals, particularly writers and researchers working on complex projects. As an experienced professional with over 50 years in technology and creative writing, I’ve identified a synergy between your organizations’ offerings that could provide transformative opportunities for users like myself.

I am currently developing a speculative fiction novel that integrates historical, military, and scientific realism with speculative elements. This project relies heavily on accurate data and continuity, requiring significant computational and organizational resources. While OpenAI provides exceptional natural language processing capabilities, the limitations of memory and integration with external datasets present challenges when managing large-scale projects.

NVIDIA’s RTX Chat and related tools have demonstrated promise in leveraging local datasets for more tailored solutions. However, these systems currently lack the advanced contextual and creative capabilities that OpenAI excels in. By combining the strengths of both platforms, users could benefit from:

  1. Integrated Dataset Utilization: Allowing OpenAI models to securely access locally stored datasets (e.g., manuals, PDFs, web archives) via NVIDIA’s systems, enabling deeper contextual understanding without compromising OpenAI’s training processes.
  2. Hybrid Computing Models: Leveraging NVIDIA GPUs for localized processing to reduce reliance on cloud systems, while maintaining OpenAI’s advanced language generation capabilities.
  3. Dedicated Pilot Programs: Creating a collaborative pilot program for users with advanced computational setups (e.g., RTX 30, 40, or 50 series cards) to test integrated AI workflows in real-world creative and research scenarios.

Such a collaboration could greatly benefit users who lack the resources or technical expertise to build custom AI models, while still providing them with the tools to maximize productivity and creativity. This integration would also showcase both OpenAI and NVIDIA as leaders in democratizing access to cutting-edge AI technology.

I am prepared to provide additional feedback, participate in pilot programs, and share insights from my decades of experience working with AI concepts (dating back to projects with Arthur Andersen in the 1980s) and creative writing. I currently maintain extensive datasets relevant to my work and am exploring hardware upgrades to support NVIDIA’s latest offerings, such as multiple 50 series GPUs or AI units.

I believe this proposal aligns with both companies’ missions to advance AI technology in ways that empower users and foster innovation. I would be happy to discuss this further or provide additional details as needed.

Thank you for considering this proposal. I look forward to hearing from you.

Mike Cataldo