GPT forgets previous messages after knowledge retrieval (RAG)

I have a single ~1,700-line .txt file that the GPT successfully retrieves from to answer the first question in the conversation. For any later message, the GPT retrieves knowledge again but then completely forgets the previous messages.

Here is an example:
https://chat.openai.com/share/0e1057c9-619b-4c03-8bb7-77f924644755

In this example, it retrieves its knowledge and answers my question. Then I ask a follow-up: it retrieves its knowledge again, completely forgets my first question, and makes up an answer to my follow-up plus a question I never asked it.

I suspect that either:

  • A bug exists where the first part of the conversation is deleted upon subsequent instances of knowledge retrieval
  • Or the amount retrieved is too large and fills up the entire context window (a rough token-count check is sketched below)
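
If it's the second case, a quick sanity check is to count the tokens in the knowledge file and compare that against the context window. Here's a minimal sketch using the tiktoken library; the file name and the window sizes are just assumptions on my end, not what the GPT builder actually does internally:

```python
# Rough check: how much of a context window would the knowledge file eat
# if it were pulled in wholesale? (pip install tiktoken)
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")  # tokenizer family used by recent GPT models

with open("knowledge.txt", encoding="utf-8") as f:  # placeholder for the uploaded .txt
    text = f.read()

n_tokens = len(enc.encode(text))
print(f"{n_tokens} tokens in the knowledge file")

# Assumed window sizes for comparison; retrieved chunks, instructions, and
# chat history all have to share whatever the real limit is.
for window in (8_192, 32_768, 128_000):
    print(f"  {n_tokens / window:.0%} of a {window:,}-token window")
```

If the file alone is a large fraction of the smaller windows, repeated retrieval could plausibly crowd the earlier turns out of context.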

Furthermore, you will notice in my example that it writes URLs in code blocks; this is my workaround for the non-clickable-hyperlink bug that still occurs for me on desktop Chrome (but not on mobile).

Hoping these get fixed soon; then my GPTs will be much more usable.

I was about to call your post into question because it doesn’t look like a lookup happened for the first post, but it appears that citations are currently also borked.

Are you talking about the white dwarfs thing?

That could be an actual bug. Sometimes the models will ignore the prompt and pull something out of the training set instead. It's possible that we're seeing a very specific case of that here.

Is the question actually answerable with the context you’re giving?


ChatGPT is kinda difficult to debug. Sometimes it just does weird stuff and there’s nothing we can do other than regenerate and hope for the best :confused: