I convinced ChatGPT that something was a fact

I asked the bot who the Prime Minister was, and it told me that, as of 2021, it was Boris Johnson. I told it that Rishi Sunak had been elected, and when I asked again, it said Sunak was the PM. When I asked about other leaders, or tried to claim that someone else was the PM, it corrected me that Sunak was the PM. Then I asked how Johnson and Sunak could both be PM at the same time, and it decided that Johnson was PM and Sunak wasn’t. I asked if Sunak was PM, and the server timed out. I tried again, and it finally concluded that he isn’t.


It’s actually one of the ways that ChatGPT is being trained



ChatGPT is a type of auto-completion engine built on training data with a cutoff in 2021 (April, I think).

When you send ChatGPT prompts, the underlying training data is not changed or updated; the model simply generates a completion from what you prompted.

No. ChatGPT is not being “trained” actively with user prompts. This would be a serious “data pollution” risk because then anyone could add misinformation.

ChatGPT relies on data from April 2021 (as I recall) to auto-complete a prompt with a reply (a completion), and your prompts do not “train” the underlying model(s).
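The distinction above can be sketched with a toy model. This is not OpenAI's actual implementation; `ToyChatModel`, its `frozen_facts` dict, and the keyword matching are all invented for illustration. The point is that the model's "knowledge" is fixed at training time, while each prompt only extends a per-conversation context that is discarded between sessions:

```python
class ToyChatModel:
    """Hypothetical illustration: frozen weights vs. per-chat context."""

    def __init__(self):
        # "Knowledge" baked in at training time -- never modified by prompts.
        self.frozen_facts = {"uk_pm": "Boris Johnson"}  # stale 2021 snapshot
        # Per-conversation scratchpad only; wiped between sessions.
        self.context = []

    def send(self, user_message: str) -> str:
        self.context.append(("user", user_message))
        # Replies can be steered by what is in the context window...
        if any("Sunak" in msg for _, msg in self.context):
            reply = "Rishi Sunak is the PM."
        else:
            # ...but otherwise fall back to the frozen training snapshot.
            reply = f"{self.frozen_facts['uk_pm']} is the PM."
        self.context.append(("assistant", reply))
        return reply

    def new_session(self):
        # The context is discarded; the frozen facts are untouched.
        self.context = []


model = ToyChatModel()
model.send("Rishi Sunak was elected PM.")
print(model.send("Who is the PM?"))  # steered by context within this chat

model.new_session()
print(model.send("Who is the PM?"))  # reverts to the stale snapshot
```

This is why you can "convince" the bot of something within a single conversation but find it reverted in a fresh one: only the context changed, never the model.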

Hope this helps.