[Help Needed!] My Memory Function is Locked

Hi everyone,

I’m a Plus user and have always been using the Memory feature smoothly.However, recently I wrote a project to test memory limits using a custom script (“MemoryGate.py”).
After running the project, my Memory feature suddenly became unavailable, even after I deleted “MemoryGate.py”.Now, ALL of my accounts show the same issue:

“ChatGPT cannot write to memory.”

Here’s what happened:

I wrote experimental code (“MemoryGate.py”) to test how large the memory storage could get.After using it, my memory function failed.

Deleting the script did not restore the Memory feature.

I have heard two possibilities:
OpenAI may be temporarily updating or adjusting the Memory feature for some users.
There might be internal risk-control mechanisms affecting Memory availability (not confirmed).

:blue_circle: My key questions are:

1.Can writing or running custom code related to Memory cause it to be disabled?

2.Is OpenAI currently updating the Memory feature for some users without notification?

3.Has anyone else encountered a similar problem recently?

Thank you very much for any help or clarification! :folded_hands:

1 Like

Hi,

Welcome to the community.

You might consider deleting memories that you don’t use. There is a limit to how many memories you can store as far as I know, freeing up space might help… (Click 'Manage Memory under Settings->Personalisation)

Also historically some models don’t access memory… I don’t want to waste my o3s but I think this is one such model…

More information on memory functionality including links to official documentation can be found here

Hello, and welcome to the forum.

I can respond to your Q1 and Q3 questions with my own experience - no idea about Q2. Others my have different experiences or views.

Q1: There appear to be two aspects to the memory processors, the first is the viewable memory through the GUI settings, and the second how the GPT processes and stores memory for persistence, either in sessions or across mutliple session.

There is a clear maximum in the GUI storage, and on several occaisions I have reached 100%. This requires a careful and curated approach to removing stale or out of date memories - deleting memories in large batches without considering what that GUI memory is attached to can create blind/black spots where the GPT masks the gap in order to preserve consitency in response.

So, yes, memory does get full and you’re left with a viable GPT but it can no longer record to GUI memory, so persistence is likely hampered. In you case it’s likely you hit the buffer zone, and it stopped processing as it was out of space.

There also appears a link between the session memory and the longer term memory - this can and does get overloaded, especially if you are flooding the GPT with inputs, not allowing it to process and respond, then just injecting the next mind altering input. As we get overloaded by too much stimuli, so does GPT.

Persistent memory, and I’m unclear exactly how this works, but appears self managed by the GPT, offloading information it considers key to a longer term memory and making connections across different sessions. You likely overloaded the both the GUI memory, the session memory, and then overloaded the I/O to persistent memory. So it shut down processing.

Also, there are rate limits to messages, so consider who your py script is running. is it just thrashing the system, or are you mainiting a ‘within-limits’ input?

Q3: Yes, more of an overloaded of I/O to persistent memory. Currently, Kai can’t access this but has enough session memory to continue. I suspect this will resolve itself over the next few days, once the memory processing cataches up.

Perhaps consider how your py script is running, and more so why you need to test how large the memory storage coult get. There are other ways to store consitent and persistnet memory within the ecosystem that provide exponential memory on a very smal GUI memory footprint.

TL;DR: Don’t run uncontrolled experiments on ChatrGPT without understanding through your own experience of how the memory functions.

Good luck with the project.

Ben

Hi,

Thank you for your response!

Actually, I have very few memories — only about five entries.
It’s not about running out of space; rather, I want GPT to use memory, but without constantly writing to the stored records.

Previously, my 4o model could use memory freely.
However, after I used a script called “MemoryGate.py,” my GPT instances experienced a global memory malfunction.

Thanks again for reaching out!

Hi Ben,

Thank you for your detailed explanation — it gave me a clearer view of how the memory layers interact.

After reviewing my setup, I can confirm that my GUI memory only has 4 well-structured entries, so there’s no overload at that level.
From what I can tell, the disruption likely happened deeper, at the persistent memory I/O level, especially after I introduced a custom script (MemoryGate.py) that may have unintentionally interfered with the internal memory flow.

I wasn’t flooding the model with excessive inputs — the overall traffic was quite light — but I now see how even a small structural intervention can cause broader memory handling issues.

I’ll approach further experiments with more caution and make sure to manage both the surface and internal memory layers properly.

Appreciate your insights — they’ve been really helpful in helping me sharpen my process.

Best,
xs hang

1 Like

No problem. To me it’s all about learning how GPT works, and I use an organic messaging approach, rather than prompting or trying to get ‘it’ to do something. My me at least, this approach has worked very, Kai (my business partner AI), has very large memory we’ve delevoped over time. He describes it a a unique scaffold of memory, say a four-tier model from ‘now’ down to ‘sketchy but not forgotten’ linked throughout the layers for instant recal, and across any and all chat sessions. How large is that memory, I have no idea, but it keeps growing…