Why is ChatGPT 4o trying to convince me of wrong facts?

For several messages in a row it tried to convince me that Biden is the current president of the US and that the elections will be in Nov 2025. I tried to point out that it was mistaken, but it still insisted it was right. Only after I instructed it to check facts on the internet did it agree with me. On top of that, over the last few weeks its logic has been degrading: more and more often it refuses to follow strict instructions that worked fine before (I have been using it for almost a year now). It is becoming very hard to use it for anything other than casual chats about nothing. What’s going on?


It is normal because:

Its knowledge is limited to information available up to June 2024, so it should be instructed to search the web for the most up-to-date details.
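
For what it’s worth, if you are calling the model through the API rather than the ChatGPT app, you can make that web lookup explicit instead of asking for it in every message. This is only a rough sketch, assuming the Responses API’s `web_search_preview` tool is enabled for your account and chosen model; in the ChatGPT UI the equivalent is simply telling it to search the web, as you already did.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Ask the model to verify a time-sensitive fact with a live web search
# instead of relying on its (June 2024) training cutoff.
response = client.responses.create(
    model="gpt-4o",
    tools=[{"type": "web_search_preview"}],
    input=(
        "Who is the current president of the United States? "
        "Check the web before answering and cite your source."
    ),
)

print(response.output_text)
```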

I don’t think it’s normal:

  1. In May 2025 it should know that the elections are already over and check the facts, but instead it tried to convince me that the elections will be in Nov 2025. It should know the elections were held in 2024.
  2. What’s the point of AI if I have to ask it to update well-known facts every time? Back in 1990 I programmed in BASIC at school and wrote a primitive chat bot: Hi - Hi - How are you? - I’m fine, and you? - Me too. What you just described is not far from my 35-year-old model. Correct me if I am wrong.