Odd memory problems, memory not saving

Now, with the rollout of 4o and several full memory wipes behind me, I had thought the worst was over. I simply copied my memories into a separate txt file and held onto them until, hopefully, the wipes were done. However, now there is a new problem.

GPT is triggering a memory entry fine: it tells me it's doing it, the icon pops up, and I can see it typing in the memory entry. Yet when I go to look in the memory 'bank', there is no new entry. Sometimes memory entries save to the bank, sometimes they don't (mostly they don't).

Am I missing something? I have several longer memory entries and several shorter ones. Is there only so much space for these, and have I reached my limit? There are about 15 entries in the bank. Are there some memory entry tricks I'm unaware of that are keeping these memories from saving properly?


Same here. I've been having a similar issue, with memories no longer getting added to the memory list after several dozen entries.

You got to several dozen? I'm at 13 and it has just stopped recording memories at all. I mean, it pretends to, says it does, even shows the memory-recording animation, but they don't end up in the bank.

Yeah, I submitted a ticket on their support site a couple of days ago regarding the issue. My memories can’t go past 38, so I assume there’s a hard cap on the number of letters or words in the memory list before it stops updating. I think I saw on Reddit that the limit is 1,300 tokens, but I’m not entirely sure what that means.

Ah, think of it like a word limit. So your memories are about one sentence each, I imagine?
For example:
“This sentence is 11 tokens and 45 characters.”
That was as of GPT-4; 4o's tokenizer was supposed to be more efficient, so it might be fewer tokens now, say 8 or 9? The point is that the memories have a limit, probably on their end, to avoid overloading their ability to store the massive amount of information being generated. That is probably what all of the random wipes have been about, and why we can't save any new memories after a set amount. I just wish they had a character counter or something to notify us when we hit the limit.

Oh yeah, here is a link to the token counter btw:
https://platform.openai.com/tokenizer
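
If you'd rather count tokens locally instead of pasting entries into the web tool, here's a minimal sketch using OpenAI's tiktoken package. This assumes you have tiktoken installed, and that "o200k_base" is still the encoding used for 4o (with "cl100k_base" being the older GPT-4 one); the `memories` list is just placeholder text you'd swap for your own saved entries, and the reported 1,300-token cap is unconfirmed.

```python
# pip install tiktoken
import tiktoken

# o200k_base is the encoding published for GPT-4o;
# cl100k_base is the older GPT-4 / GPT-3.5 encoding.
enc = tiktoken.get_encoding("o200k_base")

# Paste your saved memory entries here (placeholder example below).
memories = [
    "This sentence is 11 tokens and 45 characters.",
]

total = 0
for text in memories:
    n = len(enc.encode(text))  # number of tokens in this entry
    total += n
    print(f"{n:4d} tokens | {len(text):4d} chars | {text[:60]}")

print(f"Total: {total} tokens across {len(memories)} entries")
```

Note the counts will differ slightly from the GPT-4 example sentence above, since 4o's encoding splits text differently.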

Thank you, this makes sense. I’ll go over and mention this to the support team.