I think the only way, and the best way for now, is to keep the previous conversation in the prompt, or summarize what happened in the past, or combine both. The ChatGPT app itself appears to keep our recent conversation in the prompt. CMIIW
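Here's a rough sketch of the "summarize the past, keep the recent turns verbatim" approach. `summarize` is a placeholder: in a real system you'd send the old turns back to the model and ask it to condense them.

```python
def summarize(turns):
    # Placeholder: in practice, call the model with something like
    # "Summarize this conversation briefly" over these turns.
    return "Summary of %d earlier turns." % len(turns)

def build_messages(history, new_user_msg, keep_last=4):
    """Keep the last `keep_last` turns verbatim; fold older turns into a summary."""
    old, recent = history[:-keep_last], history[-keep_last:]
    messages = []
    if old:
        messages.append({"role": "system",
                         "content": "Earlier conversation: " + summarize(old)})
    messages.extend(recent)
    messages.append({"role": "user", "content": new_user_msg})
    return messages

history = [{"role": "user", "content": "Hi, my name is Sam."},
           {"role": "assistant", "content": "Nice to meet you, Sam!"}] * 4
msgs = build_messages(history, "What's my name?")
```

The trade-off is that the summary loses detail, which is why combining it with a window of verbatim recent turns tends to work better than either alone.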
So essentially, if it's a very long conversation, the whole history would have to be sent in each time for the model to remember and keep context? And the token count would keep growing too?
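Yes, which is why most setups trim the history to a token budget before each call. A minimal sketch (using a crude characters-per-token estimate rather than a real tokenizer such as tiktoken):

```python
def rough_token_count(text):
    # Very rough heuristic (~4 chars per token); use a real tokenizer in practice.
    return max(1, len(text) // 4)

def trim_history(messages, budget=3000):
    """Drop the oldest messages until the estimated token total fits the budget."""
    kept = list(messages)
    while kept and sum(rough_token_count(m["content"]) for m in kept) > budget:
        kept.pop(0)  # oldest turns go first
    return kept

# 40 messages of ~100 estimated tokens each = ~4000 tokens total.
msgs = [{"role": "user", "content": "x" * 400} for _ in range(40)]
trimmed = trim_history(msgs, budget=3000)
```

Dropping from the front keeps the most recent context, at the cost of forgetting the oldest turns — that's the gap that summarization or retrieval is meant to fill.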
Just a random thought: keep a running set of keywords/concepts as the conversation grows and send it along with each chat-completion call? (Obviously, all of this information has to be kept within our own systems.)
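A toy version of that idea, assuming a simple stopword filter and frequency count for keyword extraction (a real system might use TF-IDF or embeddings instead):

```python
import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "is", "are", "i", "you", "to",
             "of", "and", "it", "in", "that", "my", "for", "we"}

def extract_keywords(text, top_n=5):
    """Return the most frequent non-stopword words in the text."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS and len(w) > 2)
    return [w for w, _ in counts.most_common(top_n)]

class KeywordMemory:
    def __init__(self):
        self.keywords = []

    def observe(self, utterance):
        # Accumulate new keywords as the conversation grows.
        for kw in extract_keywords(utterance):
            if kw not in self.keywords:
                self.keywords.append(kw)

    def as_system_prompt(self):
        # Prepend this to each chat-completion call as cheap long-term context.
        return "Key topics so far: " + ", ".join(self.keywords)

mem = KeywordMemory()
mem.observe("My dog Rex loves hiking in the mountains.")
mem.observe("We went hiking near the lake yesterday.")
```

This costs only a handful of tokens per call, though bare keywords lose a lot of the relational information ("who did what") that a summary would keep.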
I’ve been messing around with context and continuity with gpt-3.5-turbo — mostly as a distraction, not as something serious. I’m using a transformer to tokenize a file of mostly facts, as well as the dialog history. The idea is that when you type a prompt, if there’s a related sentence, it gets plucked out of the text files and included with the prompt. It works pretty well: I can chat with the bot for quite a while, then ask it what my name is, and it responds correctly. The code sucks, but here it is: Bitbucket
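For anyone curious, the retrieval step can be sketched in a few lines. This is not the poster's transformer-based version — just a bag-of-words Jaccard-overlap stand-in to show the shape of the idea: score every stored sentence against the prompt and include the ones that pass a threshold.

```python
import re

def tokenize(text):
    return set(re.findall(r"[a-z']+", text.lower()))

def jaccard(a, b):
    # Word-overlap similarity between two token sets.
    return len(a & b) / len(a | b) if a | b else 0.0

def retrieve_related(prompt, sentences, threshold=0.2):
    """Return stored sentences whose word overlap with the prompt passes the threshold."""
    p = tokenize(prompt)
    scored = [(jaccard(p, tokenize(s)), s) for s in sentences]
    return [s for score, s in sorted(scored, reverse=True) if score >= threshold]

# Hypothetical fact file contents, kept on our own systems.
facts = ["The user's name is Sam.",
         "Sam lives in Ottawa.",
         "The capital of France is Paris."]
hits = retrieve_related("What is my name?", facts)
```

The retrieved sentences get prepended to the prompt, which is why the bot can answer "what is my name?" long after the original turn scrolled out of the context window. Swapping the overlap score for embedding cosine similarity is the usual upgrade path.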