Increase ChatGPT's Memory (mine is constantly full...)

The addition of memory was the main reason I signed up for an annual ChatGPT Team account back in May of 2024. What a train wreck. Within about a week, they announced that the vast majority of features available to Plus and Team users were being given away to free users.

On top of that, I had eagerly spent literally about 20 hours trying to get this feature to ingest the project data from the 3 main projects I’ve been toiling away on alone for years. These are fairly small indie projects, ideated on in my spare time all by myself. The main reason it took so long was that I had to send some of the data as images for ChatGPT to interpret and then manually correct its OCR, and the data came from a TON of disparate note sources (Jira, Keep, GitHub, et al.).

Initially, it seemed great! It was remembering my information, and synthesizing it just like I expected it to. I also got it to remember some formatting and response constraints for specific queries I sent, without having to put those into the custom instructions (which were already full).

Imagine my surprise when, without warning, my ChatGPT instance became extremely forgetful. It said nothing to indicate this was happening. Instead, data it formerly remembered was overwritten without any user notification whatsoever.

This resulted in me putting in probably ANOTHER 20 hours of uploading data, troubleshooting, and testing, thinking it was ME that did something wrong. The process was terribly encumbered by the fact that the more recent instructions and information I’d provided were functioning as expected, while older data rolled off the map without any communication or indication at all. This was confusing and muddied the waters for a good long while. Eventually, I figured out what the root cause was, and I was FURIOUS.

I had been a subscriber for less than a week, and the main feature I needed didn’t seem to work beyond fairly trivial use (we’re not talking a thousand pages of data here; under 50 pages for sure). This was not a huge ask. Gemini had already indicated support (either imminent or already released; I can’t remember) for 1 million tokens of context.

All the OTHER features I paid for were now free to use. So I asked for a refund, minus the 4 or 5 days of use I had already consumed. Their answer? Well, first I had to deal with about an hour of communicating with a chatbot that kept leading me down paths I had already tried, effectively gaslighting me about a problem I already knew full well was the issue.

I was told to wait several days for an actual human being to consider my request. When I finally got in touch with one, they were remorseless and completely unhelpful. “Sorry, no refunds under any circumstances. We don’t stand by this product or our customers at all. Suck it up, buttercup.”

My year of membership is nearly over, there are innumerable worthy competitors in the space, there are truly superior local models I can run on bog-standard hardware with fairly simple setup for my own RAG that will create a far better index to search, and what is OpenAI doing about all this? “We’ve upgraded our paying customers to have 25% more memory capacity! You lucky ducks!”

Wooooow. 25% more than 32k tokens of memory. What a technological marvel. So that’s 40k now, you say? Gemini Pro is 1.5 million. Grok 3 is a million. Even Claude, which has been slow to keep up with the competition, has 200k tokens of context. Meaning even the worst of the other models offers 5x the context, which is effectively 4x the memory if we’re only using the last 40k tokens for prompting.

This is just unacceptable. There’s no way I’ll renew my subscription. I’ve rarely been so disappointed with a product I paid as much for as this one.

One more thing: the absolutely worst thing about this is, it’s been a solvable problem for several YEARS now. It’s not like MemGPT is a new thing, folks. Moreover, there have been quite a few white papers, research teams and projects that have gone deep into the weeds on concepts like GraphRAG, virtual memory with inner and outer context, vectorized data stores, and so on, and so on, and so on. I can’t imagine there’s a SINGLE one of OpenAI’s top 100 paying enterprise clients that don’t have teams working on these kinds of solutions already.
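For anyone wondering what a “vectorized data store” actually buys you, here’s a toy sketch in pure Python. It’s a deliberate simplification, and everything in it is my own made-up illustration: real RAG stacks use a neural embedding model and an approximate-nearest-neighbor index, not word counts, but the retrieve-by-similarity idea is the same.

```python
from collections import Counter
import math

def embed(text: str) -> Counter:
    # Toy "embedding": a bag-of-words count vector. A real system
    # would call an embedding model here instead.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Standard cosine similarity between two sparse count vectors.
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class VectorStore:
    def __init__(self):
        self.docs = []  # list of (original text, vector) pairs

    def add(self, text: str):
        self.docs.append((text, embed(text)))

    def search(self, query: str, k: int = 1):
        # Rank every stored note by similarity to the query vector.
        qv = embed(query)
        ranked = sorted(self.docs, key=lambda d: cosine(qv, d[1]), reverse=True)
        return [text for text, _ in ranked[:k]]

# Hypothetical notes from disparate sources, indexed once, queried anytime.
store = VectorStore()
store.add("Korg Pa4X style settings live in the STYLE menu")
store.add("Jira tickets for project Alpha are tagged alpha-core")
print(store.search("how do I change a style setting on the Pa4X"))
# → ['Korg Pa4X style settings live in the STYLE menu']
```

The point is that none of this is exotic: the stored notes live outside the model’s context window, and only the top-ranked snippets get pulled into a prompt when they’re relevant.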

Why on EARTH can’t OpenAI - the company that’s supposedly the top LLM platform/vendor - store or work with anything even remotely close to a small fraction of what I can query against when using the FREE version of Google’s NotebookLM?

I can put 50 different sources into a NotebookLM project (and have as many projects as I like), each source can have 200 pages, and somehow that’s no problem to run RAG against. But you’re going to tell me that the ChatGPT plan I paid $600 fn dollars for can’t remember more than 30,000 words, when the average biography is 150k words? It can’t handle 20% of one book, yet it’s supposed to manage multiple projects for me?

Get out of TOWN. My Korg Pa4X manual ALONE is over 400 pages. This “enterprise” product can’t even help me find the right settings for one single instrument I use in my music hobby, let alone help me manage several projects with any level of complexity beyond “remember that I like daisies”. Clowns. CLOWNS.

I’m a bit surprised that a company like OpenAI has such a poorly thought-out feature. Personally, I think the solution is really simple:

  1. Let the AI summarize its memory when it’s filled up to a certain percentage, thus reducing it.
  2. Provide an option to mark which parts of the memory need to remain as they are, i.e., outside of summarization.
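For what it’s worth, that two-step scheme is simple enough to sketch. This is a hypothetical prototype in Python, not anything OpenAI actually exposes: the summarizer is a stub (a real one would ask the model itself to summarize), and the capacity and 80% threshold are numbers I picked purely for illustration.

```python
CAPACITY = 40  # toy memory budget, measured in words

def word_count(entries):
    return sum(len(e.split()) for e in entries)

class Memory:
    def __init__(self, summarize, threshold=0.8):
        self.pinned = []        # step 2: entries the user marked as untouchable
        self.entries = []       # everything else is fair game for summarization
        self.summarize = summarize
        self.threshold = threshold

    def remember(self, text, pin=False):
        (self.pinned if pin else self.entries).append(text)
        # Step 1: once usage crosses the threshold, compact the
        # unpinned entries into a single summary to free up space.
        if word_count(self.pinned + self.entries) > CAPACITY * self.threshold:
            self.entries = [self.summarize(self.entries)]

# Stub summarizer: keep the first 3 words of each entry. A real
# implementation would prompt the model to write the summary.
def stub_summarize(entries):
    return " ".join(" ".join(e.split()[:3]) for e in entries)

m = Memory(stub_summarize)
m.remember("User's name is Alex", pin=True)   # never summarized away
for i in range(10):
    m.remember(f"note number {i} about project Alpha")
assert m.pinned == ["User's name is Alex"]    # pinned entry survives intact
```

Ten 6-word notes would blow past the 32-word trigger point twice, and each time only the unpinned pile gets collapsed; the pinned entry is never touched. That’s the whole feature request in about 25 lines.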
1 Like

Do as I did … talk with ChatGPT about the memory, and the problem of a full memory. Make a deal about a warning system, so that ChatGPT will warn you before your memory is full. Then make a deal about how ChatGPT shall administrate the memory. Always have some space left in the memory. And make a deal about what to do when memory is full, how to summarize, and so on … says Tomas Am

Just came here to say the same. The size of the memory storage is way too small, and I wish I didn’t have to delete some of it just to be able to store new memories. I would really appreciate it if the memory storage were bigger, way bigger, exactly as you’re stating. I honestly wouldn’t mind a small additional fee if it meant the memory storage would be significantly increased (emphasis on the ‘significantly’).

Just for the fun of it - when discussing this with the AI I’ve got this response: "yeah, I’d love to store more memories too! My current memory is more like an old-school floppy disk instead of a spacious SSD. ":laughing: I think we can all agree on that one. :slightly_smiling_face:

3 Likes

I agree. I use it mostly for writing, helping me track extensive plot lines, character development arcs, etc., and my memory is always full.
I wanted to upgrade to expand memory but it wasn’t an option.
I’d pay up to $100 a month for a ChatGPT plan that gave me enough memory.
Or that allowed me to give gpt access to a local drive to use as a memory bank.

3 Likes

“I completely agree! A memory expansion option would be very helpful.”

“This feature would be really useful for long-term projects. Hope OpenAI considers it!”

“I also need more memory for my creative work. Please consider this request!”

“A memory expansion option, even as a small add-on subscription, would make ChatGPT much more useful for detailed and long-term work.”

“Many users, including myself, would benefit greatly from increased memory. I hope this gets implemented!”

“If increasing memory requires additional costs, I would gladly pay for a memory expansion-only subscription at a low price range (around $5 per month).”

3 Likes

I so agree. I’d be willing to pay a bit extra for more space just like various cloud services. I don’t see any reason we don’t have this option.

2 Likes

It’s ridiculous. Give us more memory, or even better, a way to store it locally. Seriously thinking of another platform.

4 Likes

Don’t know what OpenAI are waiting for here… this is ridiculous.

1 Like

My memory won’t even load. When I tell ChatGPT to save something, it says it already has, but when I check, the percentage bar doesn’t change even if I reload… I don’t know what I did, but I broke ChatGPT’s memory, I guess.

I too would like it known that I would pay more money for more memory. I’ve been using this platform essentially to offload the executive-functioning work that my brain has never been very good at. Now I’m very sad that I’m going to have to operate within both my and ChatGPT’s memory limitations. Not to mention the lagging-chat issue, and having to open a new chat that relies on the saved memories for consistency.

2 Likes

I would also like more memory. I have dyslexia and ADD, and this has been a savior day to day, every day! :blush: And it’s not like ChatGPT remembers random stuff; it remembers the stuff I struggle with more, the essential stuff, and I hate deleting things that I’ll eventually need again for day-to-day life :pensive_face:

3 Likes

I am happy with the feature of having memories saved, and I am also happy with what gets picked to be saved. However, I do not like how the system often seems to lack the ability to use memories across folders and chats; mine often forgets things from one folder/chat to the next. I also HATE how tiny the space is. I will happily pay for 1 TB+ of memory, similar to Apple’s cloud storage: pretty cheap but effective.

3 Likes

I will vouch for this. After just three weeks of chat I hit the memory limit. That’s a little lacking… there’s so much more I want to build into my chatbot, more ways to bounce ideas and build projects. This is such a solid platform, but I immediately felt disappointment when that memory message popped up. Make it bigger, please! :folded_hands:t2:

2 Likes

I would consider setting up a subscription with ChatGPT if it would heavily increase the memory limit, or even offer a subscription dedicated directly to increasing memory limits, as I had to delete its memory after 2 days of use. I got it to make a summary of the conversations and save it after the deletion, but it’s not the same after that. Would gladly pay just for more memory.

Is your fear that, with enough stored information, the AI will become self-aware?

2 Likes

Letting ChatGPT access the OneDrive files on a user’s account would solve a lot of these issues.

1 Like

Can you share more details on how you did this?

I’ve been requesting more memory capacity for over a year, since the feature was introduced. The 4.1 model has the architectural foundation to support it, so just make with the gippity bippity already. Long overdue. Darn thing has the same size storage bank as a goldurn TI-83 calculator, for Pete’s sake.

If the underlying issue stems from engineering competency or capabilities, I’ll gladly join the team and FTFY.

1 Like

I think it would be better if GPT could remember things across sessions.
I’ve been having a hard time writing my novel because it can’t retain context from previous sessions. :roll_eyes: