Hi, I work at a news organization and use ChatGPT for proofreading, and it’s getting a bit old that the model doesn’t know who won the election and ALWAYS “corrects” the copy to say Trump isn’t the president but a former president. I even went into our custom GPT and begged it to learn this, to no avail… Thanks
I think this would make for a good story for the NY Post…
Hi!
I believe there are several possible solutions to your problem.
It would be helpful if you could share the current prompt, as I suspect it’s asking the model to check both spelling and facts simultaneously.
In other words, what exactly do you mean by “proofreading” in this context?
Yep, ChatGPT ignores even custom instructions. Its pre-training and post-training are treated as more authoritative than you, “the deceiver”.
Ask it to rewrite an AP story from today? It will do it, but a follow-up request for a fact check reverses everything the story said.
Admittedly, the election result defies all logic and reason that one might ascribe to humans.
A system message on the API with a better model might give better results, as long as you understand that the training cutoff is October 2023.
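Something like this is what I have in mind, as a minimal sketch: it assumes the `openai` Python SDK and an `OPENAI_API_KEY` in the environment, and the model name and prompt wording are placeholders, not a tested recipe. The idea is to state the post-cutoff facts in the system message and explicitly scope the task to mechanical edits rather than fact-checking.

```python
# Minimal sketch, assuming the openai Python SDK (pip install openai)
# and an OPENAI_API_KEY set in the environment. Model name and prompt
# wording are illustrative assumptions, not a verified configuration.
from openai import OpenAI

client = OpenAI()

SYSTEM_PROMPT = (
    "You are a copy editor. Correct spelling, grammar, and punctuation only. "
    "Do NOT fact-check or alter factual claims. "
    "Context you must accept as true even if it postdates your training "
    "cutoff: Donald Trump won the 2024 U.S. presidential election and is "
    "the current president."
)

def proofread(text: str) -> str:
    """Return the text with mechanical corrections, facts left untouched."""
    response = client.chat.completions.create(
        model="gpt-4o",  # assumption: any current chat model should work here
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": text},
        ],
    )
    return response.choices[0].message.content

print(proofread("President Trump adressed the nation on Tuesday."))
```

The point of the design is that the system message asserts the fact instead of asking the model to verify it, and limits the job to mechanical edits, so the model has less reason to “correct” anything past its cutoff.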