Is there a change in the conversation history size?

I have a very big prompt: it is 1,400 tokens, and to make it work I enter it in blocks across 10 interactions.

Last week it worked like a charm: each answer was 300 tokens, with about 40 answers per conversation and few problems. But this week OpenAI updated ChatGPT, and now it only remembers approximately the last 10 interactions, or 4k tokens (or less). My tests are terrible and ChatGPT doesn't want to cooperate. Even though it is GPT-4, I have not seen results this bad since GPT-3.5 a year ago.

Also, last week it was able to count the words and tokens in the conversation, and now it refuses to do so.
Before the update, the answers took their time but were correct; now they are fast but completely wrong, because the model is using only the last answer.

Yes, several forum posts have reported that ChatGPT's conversation memory has become abysmal.

That's just part of OpenAI's "optimization". Chat management is a separate system from the AI model itself, and sending more of your past chat with each question you ask takes computation.

You might just get your Notepad++ ready and build a complete prompt that can be pasted in and acted upon as though you were sending it to a new chat conversation each time. Then actually start new conversations (or edit the top message) if the AI still gets hung up on what it just produced.

OK, validated with ChatGPT Data Analytics: before it was 8k (or 4k?), now it is 2k.
Again, my prompt is 1,400 tokens and each answer is 300 tokens. Usually the first 3 answers are training; after training, the first 10 answers or so are OK, and later I use my notepad to fix mistakes (for example, mistakes in the battle narration).

Note: my prompt is an open-world slice-of-life RPG interactive fiction (visual novel). A week ago I was generating 50 answers before the chat crashed my browser from the RAM used; now GPT makes a mess of the story by the 3rd answer and forgets who the main protagonist is =/

This picture was taken after inserting my prompt and the training answers; I skipped ahead to check the token/word count.

4/11: GPT+DALL·E is a mess; GPT with Data Analytics is thinking but capped at 2k. I did more tests, and the moment ChatGPT forgets something, I do a word count and it is always ~2k =/
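Since ChatGPT can no longer be relied on to count tokens for you, it's easy to estimate locally. A minimal sketch, assuming the common rule of thumb of roughly 4 characters per token for English text (for exact counts you'd use a proper tokenizer library such as OpenAI's tiktoken instead):

```python
# Rough local token estimate, so you don't depend on ChatGPT counting for you.
# Assumption: ~4 characters per token for English prose (a rule of thumb, not
# an exact tokenization). Exact counts require a real tokenizer like tiktoken.

def estimate_tokens(text: str) -> int:
    """Estimate the token count of `text` from its character length."""
    return max(1, round(len(text) / 4))


if __name__ == "__main__":
    # Hypothetical example prompt, just to demonstrate the estimator.
    prompt = "You are the narrator of an open-world slice-of-life RPG."
    print(estimate_tokens(prompt))
```

Summing `estimate_tokens` over your prompt plus every answer so far gives a quick check of when the conversation crosses a suspected ~2k-token cap.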