ChatGPT constantly has context amnesia

I have found that when I am working with ChatGPT, something has changed in the way context works.

A normal conversation leaves various things implicit. For example, if I have GPT write a program, I may then ask about it without specifically pointing out that I mean that program, plus the 10+ other aspects of context (that is what a history is for, right?).

Lately, ChatGPT acts very “dumb”, like it is missing brain cells. It will just start redoing the work from scratch. Then I stop it and say “no, use our previous discussion”, and it does it with the right context.

So has ChatGPT been turned into some sort of “quick one-shot” tool? More than a ChatGPT, it is an InstructGPT, and that is really not very useful.

I have to tell GPT to remember what was just output. It’s not the context size or anything else; it is simply not looking at, or is ignoring, my previous discussion, thinking we are “starting again fresh”.

Seems like a huge bug, and I’m not sure why nobody has noticed that this is probably the cause of all the “GPT doesn’t code well anymore” complaints. Yes, because it acts like it constantly gets a lobotomy unless you remind it that you have been speaking previously.

I would post a chat, but getting ChatGPT to do the work now seems to require language that won’t let me share it because of “moderation” :smiley: hah. If you don’t talk harshly, it doesn’t do the work, which is sad. Why do we have to cuss at it like a sailor to squeeze out good work? It works, honestly: just get angry and boom, output appears where it previously told you it wouldn’t, couldn’t, or didn’t seem to remember the last prompt and the history. Suddenly it knows everything. Ridiculous. I hate that I have to be rude to get good work from ChatGPT now.

So another aspect: ChatGPT becomes much better the harsher you talk to it. Why? I don’t agree with, advocate, or like this fact, but it seems to hit the jackpot every time.