Hallucination / Confabulation! Issue fixed: missing 'in context' info

We have built and deployed AI chatbots on our website www.techmeetups.com and all seemed good, except now I am seeing increased hallucination / confabulation … whatever the term.
A recent prospect enquired about costs, and although the chatbot has all the prices, it proceeded to give its own prices (1 & 2 in screenshot). Almost like price gouging!! (which it wasn't ever asked to do!). It even made up some random links and provided them (3 & 4).
On further questioning it corrected itself and gave the prices available to it (5).
I will of course try to ensure it doesn't do this and always ONLY gives the prices available to it, but did anyone else face this?
Have attached the conversation screenshot & AI chatbot settings.
The chatbot uses the GPT-4 8K-token model (April '23 release).

The crazy idea that comes to mind is that your ChatBot needs a plugin: when pricing comes up, it should call the plugin to fetch the real data and then, hopefully, incorporate that into the completion.

As I have never tried this and have not seen it noted or done, I call it crazy, but then again I have taken some really crazy ideas and they do work. Isn't there a quote about crazy and genius? :slightly_smiling_face:


Another way to think about this is that the ChatBot needs the information about pricing. Currently, as I understand what you note, the ChatBot is getting the information from the vectors created from the training set (or possibly something else), and somewhere between retrieval and completion this leads to hallucinations; in short, it is a nondeterministic action.

You need a deterministic action, and plugins can provide that. But, as I understand it, you created the ChatBot yourself, so you need a way for it to recognise that when the attention is on pricing it should use a deterministic action. While I noted a plugin as the first thought, a better idea is to call a function, since this is probably already being done through an API.

Hopefully this shows that the idea is not tied specifically to plugins but that a way to get the ChatBot to use a deterministic action is needed.
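To make the idea concrete, here is a minimal sketch of the deterministic-action approach using function calling. The product names, prices, and the `get_pricing` / `handle_tool_call` helpers are all hypothetical, invented for illustration; the tool schema follows the general shape of OpenAI's function-calling API, and in a real deployment the prices would come from your own database.

```python
import json

# Hypothetical price list -- in production this comes from your own
# database or CMS, never from the model itself.
PRICES = {
    "standard_ticket": "49 EUR",
    "startup_stand": "500 EUR",
}

# Tool schema in the general shape OpenAI's function-calling API expects.
# The model can only ask for this function; it cannot invent prices.
PRICING_TOOL = {
    "type": "function",
    "function": {
        "name": "get_pricing",
        "description": "Return the official price for a product.",
        "parameters": {
            "type": "object",
            "properties": {
                "product": {"type": "string", "enum": sorted(PRICES)},
            },
            "required": ["product"],
        },
    },
}

def get_pricing(product: str) -> str:
    """Deterministic lookup: same input always yields the same price."""
    return PRICES[product]

def handle_tool_call(tool_call: dict) -> str:
    """Dispatch a tool call emitted by the model to the real function."""
    if tool_call["name"] == "get_pricing":
        args = json.loads(tool_call["arguments"])
        return get_pricing(args["product"])
    raise ValueError(f"unknown tool: {tool_call['name']}")

# Simulate the tool call the model would emit for a pricing question:
print(handle_tool_call({"name": "get_pricing",
                        "arguments": '{"product": "standard_ticket"}'}))
```

The result of `handle_tool_call` is then sent back to the model as a tool message, so the final completion quotes the real figure instead of a hallucinated one.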


Sorry guys, on debugging the messages I just realised what had happened. The AI chatbot didn't use the background info in the first response, so it made up prices. :man_facepalming: My bad :pray:

Hi Eric, yes that's the other way to ensure consistency, using plugins or functions.
Actually, if clearly specified 'in context', it doesn't hallucinate. We have tested this. This particular issue was that the 'in context' info was missing. It has been fixed :+1:
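For anyone hitting the same thing: the "in context" fix amounts to making sure the price list is actually injected into the prompt before the first response. A minimal sketch, with invented product names and a hypothetical `build_system_prompt` helper:

```python
# Hypothetical price list -- sourced from your own data, not the model.
PRICES = {
    "standard_ticket": "49 EUR",
    "startup_stand": "500 EUR",
}

def build_system_prompt(prices: dict) -> str:
    """Embed the official price list in the system message so the model's
    very first answer already has the real figures in context."""
    lines = [f"- {name}: {price}" for name, price in sorted(prices.items())]
    return (
        "You are the event assistant. Quote ONLY the prices listed below; "
        "if a product is not listed, say you do not know.\n"
        "Official prices:\n" + "\n".join(lines)
    )

print(build_system_prompt(PRICES))
```

The key point is that this runs on every conversation, including the first turn, so there is no message where the model has to answer a pricing question without the list in front of it.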