Train back and forth dialogues

Hi @lechnerf

This “memory” problem with GPT-3 is very common in chatbot scenarios. There have been quite a few posts on the community forum with potential solutions that use GPT-3 itself to condense/summarize the previous conversation, so you retain context while making the most of the prompt window.
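For illustration, a rough sketch of that summarization idea, assuming the older openai-python Completion interface (the model name and prompt wording are just placeholders):

```python
import openai  # assumes openai-python < 1.0 (Completion interface)

def condense_history(history: str) -> str:
    """Ask GPT-3 to compress the conversation so far into a short summary."""
    resp = openai.Completion.create(
        model="text-davinci-003",  # placeholder model name
        prompt=f"Summarize this conversation briefly:\n\n{history}\n\nSummary:",
        max_tokens=150,
        temperature=0.2,
    )
    return resp["choices"][0]["text"].strip()
```

You'd then prepend that summary (instead of the full transcript) to each new prompt.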

My hypothesis is that it can be solved with a “rolling memory”, i.e. remembering only the most recent N tokens. I haven’t had a chance to test it out, though, as my grant expired in October last year.
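A minimal sketch of that rolling-memory idea (whitespace splitting is a crude stand-in for the real tokenizer, and N is whatever token budget your prompt allows):

```python
def rolling_memory(history: str, max_tokens: int = 1500) -> str:
    """Keep only the most recent ~N tokens of the dialogue.

    Splitting on whitespace only approximates real token counts,
    but it shows the idea: drop the oldest turns first.
    """
    tokens = history.split()
    return " ".join(tokens[-max_tokens:])

# Each turn, prepend the trimmed history to the new user message:
# prompt = rolling_memory(conversation_so_far) + "\nUser: " + user_message + "\nAI:"
```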
