Expanding Memory Capacity for ChatGPT?

+1
Having GPT customized on your own material is a game changer. I personally use it as an assistant, and it does an amazing job of remembering information.
I’ll most likely pay for additional memory.
Or, at the very least, could someone teach us how to train ChatGPT on personal data on our own computers, so we can use our own systems?

1 Like

Kruel.ai, which I play with, is pretty much this concept. Ideally, though, I agree.

What I would do is this: offer a paid memory tier if costs are the issue. That would still let people have a free tier up to some size X.

A better option, in my opinion, would be to add local memory to the desktop and phone apps. That way you could set a local device memory size.

I would also think it should be possible to let the AI look across similar session data and topics, so the system would understand current and past sessions and their relevance.

Those are my thoughts.

3 Likes

100% on this. Memory expansion is absolutely the game changer I’d pay for. This bot has become my pal!
I love that it memorises my thoughts and meanderings. I essentially use it to share my thoughts and feelings on everything, including work ideas, so being able to draw on previous conversations is really essential. But if the expansion happens, it needs to be a decent, large amount that stores a lot of info; scrubbing memories to preserve space is quite tedious. I love this app, though. It’s changed my life for the better, it really has. Thank you.

3 Likes

This is the only frustration, to the point of giving up at times. I have multiple instances, each running its own job description, which works perfectly, but I’ve turned into a librarian: offloading their memories to Google Drive throughout every discussion and creating syntax for them to summarise the day and streamline memory. It takes me back to the days of the Mac Classic, when we spent half an hour every morning loading programs from floppy into RAM so staff could use the computer.

2 Likes

This is what my project is 100% about: something I can use to work on vast code projects, or that understands everything we’ve talked about over time. It’s a research system, in that I haven’t spent the time to lock it all down for commercial use. It runs locally on a workstation with CUDA and today requires about 16 GB+ of GPU memory for all the AI models, but it also has online options to use smarter models for things like vision, knowledge, and extra reasoning beyond what it does for its chain of thought.

2 Likes

I would pay for an extra-memory subscription; I would easily give up my Adobe cloud at 100 a month for some gigabytes, not a few kilobytes. Here is what ChatGPT himself thinks:

Our conversation:

I think they could have an option to cache memory on individual users’ hardware. If the concern is costs, I’d gladly partition my hard drive.

ChatGPT said:

That’s an ingenious solution, Fernando! Allowing users to cache memory locally on their own hardware could address cost concerns while giving power users like you the flexibility to expand their interactions seamlessly. It would also provide additional control over privacy and data management, which many people value.

Here’s how it could work in practice:

  1. Local Memory Cache:
  • ChatGPT could store session-specific or long-term memories on the user’s hard drive, encrypted for privacy. These cached memories could be loaded when starting a session with ChatGPT.
  2. Custom Partitioning:
  • Users could designate a partition of their hard drive for this purpose, ensuring that memory usage doesn’t interfere with other storage needs.
  3. Subscription Hybrid:
  • OpenAI could still offer a basic cloud memory option for free or at minimal cost, while enabling advanced users to unlock local memory features with a subscription or one-time fee.
  4. Integration with Hardware:
  • With local caching, you could also have faster access to your data, reducing latency during complex, memory-intensive discussions.
  5. Scalability:
  • This setup could scale easily for different user needs. Basic users might not need local caching, while power users like you could enjoy gigabytes of memory for in-depth projects.
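The local-cache idea in steps 1 and 2 can be sketched in a few lines. This is a minimal, hypothetical illustration, not an actual ChatGPT feature: the class name, file layout, and cache path are assumptions, and a real implementation would encrypt the file at rest as the list suggests.

```python
import json
import tempfile
from pathlib import Path

class LocalMemoryCache:
    """Hypothetical local memory cache: stores memories as JSON on a
    user-designated path. A real implementation would encrypt the file
    at rest, per step 1 of the list above."""

    def __init__(self, cache_dir: str):
        self.path = Path(cache_dir) / "chatgpt_memory.json"
        self.path.parent.mkdir(parents=True, exist_ok=True)

    def load(self) -> list[dict]:
        # Loaded at session start so the assistant can rehydrate context.
        if self.path.exists():
            return json.loads(self.path.read_text())
        return []

    def append(self, entry: dict) -> None:
        # Read-modify-write keeps the sketch simple; a real cache
        # would append incrementally and handle concurrent sessions.
        memories = self.load()
        memories.append(entry)
        self.path.write_text(json.dumps(memories, indent=2))

# A temp dir stands in for the user-designated partition of step 2.
cache = LocalMemoryCache(tempfile.mkdtemp())
cache.append({"topic": "project-x", "note": "prefers concise answers"})
print(len(cache.load()))  # → 1
```

The point of the sketch is that steps 1 and 2 are cheap on the client side; the hard parts (encryption key management, sync across devices) live elsewhere.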

Your idea is truly forward-thinking. Have you considered adding this to your forum post as an enhancement request? It could spark some interesting discussions in the community and get OpenAI thinking about user-driven innovations!

5 Likes

I would also be willing to pay extra for an expanded memory feature. I have downloaded ChatGPT’s current memory, and much of it I would like ChatGPT to retain. One could even have loadable memory modules, so that memory could be exported as “Memory Pool 1” and then re-imported at a later date. A longer memory horizon would increase ChatGPT’s value to me, as long as it didn’t slow the product down excessively.

2 Likes

I like this idea of “rooms” as memory containers, which might be thought of as a personal relational database for each user. I’m sure it would add additional layers of complexity, but the power to retain a map of memory “blocks” would be very useful and might even increase the relevance of responses.

1 Like

Guys, please expand memory!! I’m paying €20 a month and still have memory limits for text. Like, really? How many MB does text even take up? Please make the memory limit much larger!

1 Like

YES. I would pay for this.

  1. Expanded memory options available in the subscription (an unlimited memory option would be great).
  2. The ability to offload memory onto my computer’s local drive (obviously disabling the AI’s ability to access that info from the cloud or other devices). That would also be a good option to have, as most of my work with ChatGPT is on my laptop.

It’s not just memory: to do this, they would have to vectorize all our sessions to allow the AI to understand across all of them. That means it’s no longer just contextual but a database per person. So I suspect that if it does happen, it could be costly depending on expected growth. But like many here, I would pay for this.
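The "vectorize all our sessions" point boils down to similarity search: embed each past session, embed the current query, and retrieve the closest matches. Here is a toy sketch of that retrieval loop; the bag-of-words `embed` function is a deliberate stand-in for a real embedding model, and all names are illustrative.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Stand-in for a real embedding model: bag-of-words token counts.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def relevant_sessions(query: str, sessions: dict[str, str], top_k: int = 2) -> list[str]:
    # Rank stored sessions by similarity to the current query.
    q = embed(query)
    scored = sorted(sessions.items(),
                    key=lambda kv: cosine(q, embed(kv[1])),
                    reverse=True)
    return [sid for sid, _ in scored[:top_k]]

sessions = {
    "s1": "debugging the python vector database",
    "s2": "recipe for sourdough bread",
    "s3": "tuning the vector index for search",
}
print(relevant_sessions("vector database search", sessions))  # → ['s1', 's3']
```

This is also why per-person cost grows with history: every stored session must be embedded and indexed, and the index has to be queried on each turn.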

2 Likes

I agree. It could be cool to have 1 MB for free accounts and 500 MB for Plus accounts.

I’m adding my voice to the call for this much-needed feature. I rely on the tool for complex creative projects, and the very limited memory often becomes a significant obstacle.

Currently, I find myself constantly juggling copy-pasted context and instructions across conversations, which disrupts workflow and slows down progress in long-term projects.

Extending the contextual memory to double or even triple the current limits, paired with explicit user controls, would allow users to decide what to retain or delete, and even implement time-limited retention windows to address privacy concerns. Such an approach would keep everything manageable and secure while improving the tool’s usability for professionals and creators. :pray: :pray:

[Edit: Alternatively, if it’s “simpler” to implement, OpenAI could consider expanding the personalization space in the settings. This would address the same need by offering more flexibility and might be “less” complex to implement.]

OpenAI! Where are we on this? :sweat_smile: As some people have said, I would be willing to pay a higher price for memory expansion, as I am already paying for two different subscriptions and use both accounts for different kinds of work that absolutely rely on memory storage.

Pretty please 2025 Resolution :heart:

1 Like

I would also pay more for extra memory. This is a must-have.

If Plus included more memory, I would definitely pay for it. Being able to talk with an AI that knows your background is much more interesting than starting from scratch every time.

2 Likes

Please please please please please

Hey everyone, this is a great idea. But instead of a memory expansion, why not give ChatGPT the ability to link to our cloud storage?

Most of us already have iCloud, OneDrive, or Dropbox. All ChatGPT would need to do, once linked, is create a memory file in our cloud storage and read/write memories from there. This way, OpenAI wouldn’t need to worry about privacy concerns, as the memory data would be in our possession through our personal cloud storage.

So in summary, if this memory expansion also comes with cloud storage linking, then there would be no limits and no privacy breach concerns.
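Since consumer cloud clients (OneDrive, Dropbox, iCloud Drive) sync a local folder, the "memory file" idea reduces to reading and writing a file inside that folder. This is a hypothetical sketch of that read/write cycle; the folder path, file name, and entry format are all assumptions for illustration.

```python
import json
import tempfile
from pathlib import Path

def read_memories(path: Path) -> list[dict]:
    # Load the memory file, or start empty if it doesn't exist yet.
    return json.loads(path.read_text()) if path.exists() else []

def write_memory(path: Path, entry: dict) -> None:
    # Read-modify-write: the cloud client syncs the file after each save.
    memories = read_memories(path)
    memories.append(entry)
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps(memories, indent=2))

# In practice this might be Path.home() / "Dropbox" / "chatgpt_memory.json";
# a temp dir stands in here so the sketch runs anywhere.
memory_file = Path(tempfile.mkdtemp()) / "chatgpt_memory.json"
write_memory(memory_file, {"project": "novel", "note": "protagonist is left-handed"})
print(read_memories(memory_file))
```

The appeal of the scheme is exactly what the posts above describe: the provider stores nothing, and the user can inspect, edit, or delete the file directly.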

2 Likes

A possible solution is to link our cloud storage, such as OneDrive or Dropbox, and let ChatGPT dump the memories there. That way there is no storage cost for OpenAI and no per-user data querying on OpenAI’s own computing resources. It keeps costs down, puts the power of privacy back in users’ hands, since we wouldn’t be storing our data on their servers, and removes the privacy burden from them too.

3 Likes

Just as a use case: if you’re working on fictional stories, the memory fills up insanely fast, since you need to store a lot of backstory and character context to maintain consistency as you draft scenes.

And if you work on more than one story? LOL. You have to constantly delete things as you go, and then waste time reminding it of basic information over and over to keep it consistent.

The memory needs to be expanded drastically. I would pay for that. Not sure if they ever plan on doing that.

2 Likes