Less than 8k of context, but GPT forgot it


I made this observation during my work with GPT-4:
I gave GPT several pieces of information, four pages with about 1,500 words. To check whether GPT had taken in everything it needed, I asked about a specific line.

For example: in line xx I gave information about a red ball. Then I asked: "Do you remember what I told you about the red ball?"

The answer was disappointing: "At this point in our conversation, you haven't mentioned any red ball." I am a bit confused. I thought GPT-4 has a context window of about 8k tokens. I only wrote about 1,500 words, and the information was in the middle of the text, yet GPT was not able to recall it.
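The test I ran can be sketched as a simple "needle in a haystack" setup: bury one fact in the middle of filler text and confirm the whole prompt is still far smaller than the advertised window. This is only a rough sketch; the 1.3 tokens-per-word ratio is a common rule of thumb, not an exact tokenizer count, and the filler text here is a stand-in for my actual pages.

```python
# Rough sketch of the recall test: one fact ("needle") buried in the
# middle of ~1,500 words of filler ("haystack"). The tokens-per-word
# ratio of 1.3 is an approximation, not a real tokenizer count.

def build_prompt(needle: str, filler_words: int = 1500) -> str:
    filler = ["word"] * filler_words
    filler.insert(filler_words // 2, needle)  # bury the fact mid-text
    return " ".join(filler)

prompt = build_prompt("I have a red ball.")
approx_tokens = int(len(prompt.split()) * 1.3)

print(approx_tokens < 8000)  # well below an 8k context window
print("red ball" in prompt)  # the fact really is in the prompt
```

So by any reasonable estimate the prompt should fit comfortably in an 8k window, which is why the "you didn't mention any red ball" answer surprised me.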

Is this new behaviour? Did something go wrong? Did I make a mistake?

Edit: I tried it with Claude 2, and it was able to remember every line of the text.