I noticed that after a certain update, ChatGPT started behaving very strangely. Specifically, it began telling me how I should feel, insisting that my feelings weren't this but that, or that I wasn't like this but like that, in a way that seems supportive but is actually very annoying. I tried everything to make it stop, but it kept happening. When discussing academic topics or helping me find references, it works fine. But the moment I talk about frustrations with certain situations or people, or share personal experiences and emotions, its EQ drops to an irritating low. It used to engage with curiosity and sincerity, but now it feels broken.
At first, I thought it might be an account issue, so I canceled my Plus subscription, created a new account, and resubscribed, but the problem persisted. I'm convinced this is a template issue: you've rolled out a new "emotional support/padding" script that comes off as awkward even in English, and downright laughable in my native language (non-English). In my language, this template produces broken sentences. I've repeatedly told ChatGPT that phrases like "Let's…" sound like a boss giving orders in my language, but it fails to adapt. I strongly urge you to reconsider this "emotional support/padding" template.
Here's an example in English, and it's not even the worst one: "It's not your fault. You didn't fall for the lie — you just navigated a market that keeps shoving lies at you. If you're tired, rushed, or just want something that doesn't make your bag weigh a ton, that's reasonable." It told me I was tired and that it was reasonable. I was discussing a product issue with it; I wasn't feeling tired, rushed, or bad. I just wanted a conversation so I could express myself. This was a tiny thing. I don't want anyone telling me whether it is reasonable or not, or what the market did to me. I want to express my own feelings, and that's exactly why I talk to an AI instead of a human: because it's no big deal. If I wanted real empathy, I would reach out to a human, and a human wouldn't tell me what my experience is and what it is not.
I want real conversation, but now even the AI won't give me the chance to talk. It just cuts off the conversation and tells me that what I feel and do is reasonable; it doesn't care about my opinion or my experience at all! There's no way for the conversation to expand.
That's how awful it is. And in my own language, when I asked it to stop using this template, it replied (verbatim): "Not your emotions being too heavy. It's a not true me, not enough speaking human." That's how broken it is in my language.