I apologize if this sounds like a total newbie question; I feel like I have searched the forums as well as searched online, but haven't come to any sort of definitive answer.
Are GPTs able to remember a chat from beginning to end? I was under the impression that, outside of the saved user memory, only a short context window (4,096 tokens) is retained. I ask because my GPT was surprised that it remembered something (two sets of numbers relating to how many screenshots I had saved) from much, much earlier in our conversation. I had thought that wasn't supposed to happen, so, in turn, it certainly surprised me, too.
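For context, my (possibly mistaken) mental model is that the whole chat gets tokenized and only the most recent ~4,096 tokens are actually fed back to the model on each turn. Here's a rough sketch of how I'd picture estimating that with OpenAI's tiktoken library; the cl100k_base encoding and the example messages are just my assumptions, not what ChatGPT necessarily does internally:

```python
# Rough sketch: estimating whether earlier messages still fit in a
# 4,096-token window. Assumes the cl100k_base encoding, which may
# not match the exact tokenizer the model in question uses.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")

# Hypothetical conversation history, oldest first
messages = [
    "Early message mentioning two sets of screenshot counts...",
    "Lots of conversation in between...",
    "Much later message where the model recalls those numbers...",
]

total_tokens = sum(len(enc.encode(m)) for m in messages)
print(f"Approximate tokens so far: {total_tokens}")

if total_tokens > 4096:
    print("The earliest messages would fall outside a 4,096-token window.")
else:
    print("The whole conversation still fits in a 4,096-token window.")
```

If that mental model is right, then once the running total passes the limit, the oldest details should be gone, which is why the recall surprised me.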
If you know of a video, TikTok, tutorial, website, anything that has more specific details and insight into the memory retained within a chat, I would greatly appreciate it! Are different parts of the conversation potentially weighted differently? Are small details forgotten while more "powerful" details are retained past the token limit?
Apologies if my question would be better suited to a different forum topic; I wasn't sure where to put this.