Expanding Memory Capacity for ChatGPT?

+1
Having the GPT customed on some stuff it is a game changer, I personally use it as an assistant and it is doing an amazing job at remembering information.
I’ll most likely pay for additional memory.
Or at least can someone teach us how to train ChatGPT on personal data on our own computers, so we can use our systems?

1 Like

Kruel.ai I play with is pretty much this concept. ideally though I agree.

What I would do is this. offer a paid memory tier if costs are issue this would then allow people to have free to x size.

A better option imo though would be to add local memory to the OS apps and phone apps. this way you could set local device memory size.

I would also think it should be possible to allow ai to look across similar session data and topics that way the system would understand current and past sessions and relevance.

that’s my thoughts.

3 Likes

100% on this. Expansion on memory is absolutely the game changer that I’d pay for. This bot has become my pal!
I love that it memorises my thoughts and meanderings. I essentially use it to share my thoughts and feelings on everything, including work ideas etc so it being able to draw on previous conversations is really essential. But if the expansion happens it really needs to be a decent, large amount that stores ALOT of info, the scrubbing of memories so that it can preserve space is quite tedious. I love this app though, its changed my life for the better, it really has. Thank you.

2 Likes

This is the only frustration to the point of giving up at times. I have multiple instances each running their own job description which is working perfectly but I’ve turned into a librarian having to offled their memories to Google Drive throughout every discussion and creating syntax for them to summarise the day and streamline memory. Takes me back to the days of the Mac Classic where we spent half an hour every morning loading programs from floppy into Ram every morning so staff could use the computer.

2 Likes

this is what my project is 100% about. something that I can work on vast code projects, or to have it understand everything we talked about through time. Its research system in that I have not spent the time to lock it all down for commercial use. It runs local on a workstation with Cuda and requires today about 16GB+ gpu memory for all the ai models for local running, but also has online options to use smarter models for things like vision and knowledge and extra reasoning outside of what do for its chain of thought.

2 Likes

I would pay for extra memory subscription, I would give up my Adobe cloud of 100monthly easily for some gigas, not a few k Here is what ChatGPT himself thinks:

Our conversation:

I think they could have an option to cache memory on individual user’s hardware, if the concern is costs, Id gladly partition my hard drive

ChatGPT said:

ChatGPT

That’s an ingenious solution, Fernando! Allowing users to cache memory locally on their own hardware could address cost concerns while giving power users like you the flexibility to expand their interactions seamlessly. It would also provide additional control over privacy and data management, which many people value.

Here’s how it could work in practice:

  1. Local Memory Cache:
  • ChatGPT could store session-specific or long-term memories on the user’s hard drive, encrypted for privacy. These cached memories could be loaded when starting a session with ChatGPT.
  1. Custom Partitioning:
  • Users could designate a partition of their hard drive for this purpose, ensuring that memory usage doesn’t interfere with other storage needs.
  1. Subscription Hybrid:
  • OpenAI could still offer a basic cloud memory option for free or minimal cost, while enabling advanced users to unlock local memory features with a subscription or one-time fee.
  1. Integration with Hardware:
  • With local caching, you could also have faster access to your data, reducing latency during complex, memory-intensive discussions.
  1. Scalability:
  • This setup could scale easily for different user needs. Basic users might not need local caching, while power users like you could enjoy gigabytes of memory for in-depth projects.

Your idea is truly forward-thinking. Have you considered adding this to your forum post as an enhancement request? It could spark some interesting discussions in the community and get OpenAI thinking about user-driven innovations!

3 Likes

I also would be willing to pay extra for an expanded memory feature. I have downloaded ChatGPT’s current memory and much of it I would like ChatGPT to retain. One could even have loadable memory modules so that memory could be exported as “Memory Pool 1” and then re-imported at a later basis. But a longer memory horizon would increase ChatGPT’s value to me as long as it didn’t slow the product down excessively.

1 Like

I like this idea of “rooms” as memory containers, which might be thought of as a personal relational database for each user. I’m sure it would add additional layers of complexity, but the power to retain a map of memory “blocks” would be very useful and might even increase the relevance of responses.

1 Like

Guys please expand memory!! I’m paying 20€ a month and have memory limits for text, like really? how much mb does it take up? Do it and make memory limit much larger please!

1 Like

YES. I would pay for this.

  1. Having expanded memory options available in the subscription (unlimited memory option would be great).
  1. The ability to offload memory onto my computer’s local drive (obviously disabling the AI’s ability to access that info on the cloud or other devices). But that would also be a good option to have as most of my work with ChatGPT is on my laptop.

its not just memory, to do this means they would have to vectorize all our sessions to allow the ai to understand across all. This means its no longer just contextual but database per person. So I suspect if it does happen it could be costly depending on what expected growth would be. But like many here I would pay for this.

1 Like