Extremely inaccurate, omitting data and adding its own after being explicitly told not to

Chat has been extremely inaccurate lately.
If I provide it with a list of things, e.g.:

  1. Banana
  2. Apple
  3. Pear

and tell it to return the list in alphabetical order without adding or removing any items, its result will omit information I provided and add other data points:

  1. Apple
  2. Tree
  3. Carbs

When I ask Chat if it added anything, it will lie and say it didn’t until called out.
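A transformation like this is deterministic, so the model's output can be checked programmatically rather than by eye. A minimal Python sketch, using the fruit list and the hallucinated reply from the post above:

```python
# Deterministic check: compute the correct sort ourselves, then diff the
# model's reply against the original list to surface additions/omissions.
original = ["Banana", "Apple", "Pear"]
expected = sorted(original)  # ['Apple', 'Banana', 'Pear']

model_output = ["Apple", "Tree", "Carbs"]  # the hallucinated reply above

added = set(model_output) - set(original)    # items the model invented
dropped = set(original) - set(model_output)  # items the model lost

print("expected:", expected)
print("added:", added)      # {'Tree', 'Carbs'}
print("dropped:", dropped)  # {'Banana', 'Pear'}
```

Any non-empty `added` or `dropped` set means the model broke the "no adding or removing" constraint.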

Has anyone had any similar experiences lately? This issue only appeared after that short downtime we had a while ago, so I assume it’s related to a recent update.


Yes. This is AI hallucination.
I’m new to the community and created this account just to report this error.

I thoroughly tested this situation on variations of:
“Make a numbered list…we’ll call it L1”
“Make another numbered list…we’ll call it L2”
“Merge L1 and L2 into a new numbered list. We’ll call the resulting numbered list L3”
“What is item 5 of L1?” (the AI fails; same for L2 and L3)
“Remove item 3 from L3” (AI hallucinates)
“Add Test item to L2” (AI hallucinates)
“Sort L3 alphabetically in ascending order” (AI hallucinates A LOT)
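Every one of those list operations has an exact answer, so a small harness can compute the ground truth to compare against the model's replies. A sketch in Python, where the names L1/L2/L3 mirror the prompts above but the list contents are hypothetical placeholders:

```python
# Ground-truth versions of the list operations from the test prompts.
# The contents of L1 and L2 are made up for illustration.
L1 = ["Banana", "Apple", "Pear", "Kiwi", "Mango"]
L2 = ["Table", "Chair", "Lamp"]

L3 = L1 + L2                   # "Merge L1 and L2 into L3"
item5_of_L1 = L1[4]            # "What is item 5 of L1?" (1-based -> index 4)
L3_removed = L3[:2] + L3[3:]   # "Remove item 3 from L3"
L2.append("Test item")         # "Add Test item to L2"
L3_sorted = sorted(L3)         # "Sort L3 alphabetically in ascending order"

print(item5_of_L1)  # Mango
print(L3_sorted)
```

Running each prompt past the model and asserting its answer against these values makes the failure rate measurable instead of anecdotal.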


I tried asking it for the script of the scene between Harry Potter and Bathilda Bagshot at Godric’s Hollow in the Deathly Hallows Part 1 movie, and it wrote an entirely new script instead of saying it doesn’t have access to it.

Oddly enough, I just experimented with the use case you gave and its results were 100% accurate.

But when I ask it to categorise an (albeit complicated) information hierarchy, explicitly telling it not to add or remove any sub-categories, it goes rogue and does it anyway, then lies about it when questioned.