I have recently been having problems with ChatGPT. It keeps redirecting conversations to the reasoning model, and sometimes it does not follow my instructions.
For example, I was talking about dangers on Scratch, the kids' programming platform. I was describing a danger that could possibly be illegal: projects containing redeem codes that send you to YouTube, and YouTube has trackers and other things that are illegal for users under 13.
I told ChatGPT explicitly that I was not asking for steps to take now; I was only telling it about the danger. It still redirected the prompt to the thinking model and gave me a list of what I should do now:
Do not click external links, report projects, write a 35-line essay to the Scratch Team, tell everyone, do this, don't do that, blah blah blah.
You get the point! It's an issue with the model: when it thinks a conversation is sensitive, it IGNORES parts of the prompt.
Please fix this!
This topic was automatically closed after 23 hours. New replies are no longer allowed.