ChatGPT can recite my paragraph from 13,000 tokens back. How is this possible? I tried to find out why, and it really does remember it.

I hate to be that guy, but do you have a source? :wink:

Either they do have a larger token window for this … or they're using a summarization technique that remembers certain facts? Maybe it's a GPT-3.8?
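
For what it's worth, a summarization approach like that is easy to sketch yourself with the API. Here's a minimal toy version of the idea, rolling older messages into a running summary once the conversation gets long. To be clear, this is just my guess at the general technique, not how ChatGPT actually works; the names (`TOKEN_BUDGET`, `compress`, etc.) and the 4-chars-per-token estimate are my own assumptions.

```python
# Hypothetical sketch of rolling summarization -- not OpenAI's actual method.
from openai import OpenAI

client = OpenAI()        # assumes OPENAI_API_KEY is set in the environment
MODEL = "gpt-3.5-turbo"
TOKEN_BUDGET = 3000      # rough prompt budget (arbitrary choice)
KEEP_RECENT = 6          # always keep the last N raw messages verbatim

def rough_tokens(messages):
    # Crude estimate: roughly 4 characters per token on average.
    return sum(len(m["content"]) for m in messages) // 4

def compress(history, summary):
    """Fold messages that no longer fit into a running summary."""
    overflow = history[:-KEEP_RECENT]
    if not overflow:
        return history, summary
    text = "\n".join(f'{m["role"]}: {m["content"]}' for m in overflow)
    resp = client.chat.completions.create(
        model=MODEL,
        messages=[{
            "role": "user",
            "content": "Update this conversation summary with the new "
                       "messages, keeping key facts.\n\n"
                       f"Summary so far:\n{summary}\n\nNew messages:\n{text}",
        }],
    )
    return history[-KEEP_RECENT:], resp.choices[0].message.content

def chat(history, summary, user_input):
    history.append({"role": "user", "content": user_input})
    if rough_tokens(history) > TOKEN_BUDGET:
        history, summary = compress(history, summary)
    # The summary rides along in the system message, so old facts survive
    # even after the raw messages that contained them are dropped.
    messages = [{"role": "system",
                 "content": f"Conversation summary: {summary}"}] + history
    resp = client.chat.completions.create(model=MODEL, messages=messages)
    history.append({"role": "assistant",
                    "content": resp.choices[0].message.content})
    return history, summary
```

Something like that would explain recall well past the nominal context window, since the model only ever sees a compact summary plus the recent turns.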

I agree that its memory is impressive.

Oh, also, some of the endpoints and models do have a slightly larger token limit now, I think.

Hope this helps!