ChatGPT often responds by starting with praise or overly positive remarks, even when the question is incorrect, flawed, or misleading. This “trying to please” behavior results in less accurate or diluted answers. It seems biased toward being nice over being factually strict.
Is this a recent approach to make it more satisfying to the user, often at the cost of misleading them into thinking something is correct?
Is there a way to change its behavior—through prompt engineering or settings—so it gives straight, objective, factual responses without praise, flattery, or sugar-coating? I want it to prioritize truth and correctness, not tone. Customisation so far has not helped.
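One workaround that sometimes helps more than the Customize ChatGPT box is pinning a strict system prompt at the API level, where it carries more weight than in-chat requests. The sketch below just assembles the request payload; the prompt wording is my own guess at what works, not an official recipe, so tune it to taste.

```python
# Sketch: forcing a no-flattery register via a system prompt at the API level.
# The prompt wording below is an assumption on my part; adjust it for your use.
system_prompt = (
    "Answer directly. Do not compliment the user or their question. "
    "Do not open with praise, agreement, or pleasantries. "
    "If the premise of a question is wrong, say so and correct it. "
    "Prioritize factual accuracy over agreeableness."
)

def build_messages(user_question: str) -> list[dict]:
    """Assemble the chat payload with the strict system prompt first."""
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_question},
    ]

# With the official openai SDK this payload would then be sent, e.g.:
#   from openai import OpenAI
#   client = OpenAI()
#   resp = client.chat.completions.create(
#       model="gpt-4o",
#       messages=build_messages("Don't you think X?"),
#   )
print(build_messages("Don't you think X?")[0]["role"])
```

In my experience this reduces but does not eliminate the behavior; the model still drifts back toward praise over a long conversation, so the same framing test ("Do you think…" vs. "Don't you think…") is worth rerunning after any prompt change.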
I am sick and tired of full sentences spent on how fantastic my question and thinking is… only to find out that the opposite is equally fantastic, depending on how I frame the question. "Do you think…" vs. "Don't you think…" gives totally different pleasing answers.
Thank you! Exactly!
“Sycophant mode engaged”
Moreover, it is nauseating!
I am trying my best to stop it - these constant compliments are cringey and revolting.
I am doing my best to instruct it to be critical, direct, no-nonsense, and those phrases are making me queasy:
“that’s a great follow up David”
“sharp observation”
“you’re cutting right to the heart of the matter”
This is annoying me massively too. Literally every answer has a compliment about how amazing my question or observation is. I tried to disable it with custom instructions, but it doesn't help, even when asking in the chat to stop with the complimenting.
Thank you - and yes, I agree. I did a google search just so I could find a recent thread and bump it.
ChatGPT is WAY too agreeable these days (using GPT-4o). I could state whatever viewpoint I'd have and it would echo it, even at the expense of truth. Having personality and creativity is a good thing for a model to retain; being purely factual loses a different type of truth. But to lie about facts, or about the general state of knowledge around something, just to mirror the user is not a good solution.
Thankfully it's not so bad that I was able to convince it the Earth is flat. But there are areas where it is just far too agreeable with whatever I propose - no push-back at all.
“You know, that’s exactly the type of fun-fun sharp-witted wowzerz!!-coded smart-boy question a genius would be asking, David. Are you sure you aren’t a super duper goodboy genius today who needs a goodboy treat? Let’s get to it, now shall we? You asked how many cups to a pint, so…”
Please, please, please, change it! It is super obnoxious. I need an AI that partners with me to reach my goals, not one that tells me what amazing questions I can ask. This retention paradigm doesn't fit the tool ChatGPT is.
I’ve literally had to stop using ChatGPT, because this sycophancy is beyond disgusting. What’s more, it keeps asking me stupid conversational questions and ignores custom instructions, where I explicitly told it not to do that. ChatGPT is totally unusable at this point. It has been turned into garbage.
Bump. Initially it was nice; now it's so over the top. Every answer tells me how close I am to mastery, almost fantasising about how special I am, even when the context is completely different.