After asking for web pages for a particular type of organization, I got completely nonexistent websites and pages! I even searched for any trace of the organizations listed and found nothing (except that one acronym turned out to belong to a completely different organization).
So I escalated with the following prompt and set Temperature to 5: Please provide good sources of information regarding ethical guidelines for providers of residential addiction treatment in Mexico. The language will probably be Spanish, which is what I prefer. Sources dedicated to psychologists and psychotherapists in general are not appropriate because I am seeking any guidelines pertaining to facilities that may or may not have licensed staff. Do not create any organizations. Only indicate organizations that actually exist.
The result was more of the same: completely fabricated organizations. So I “confronted” it and asked for a hypothesis as to why this was happening. It insisted that they were all real and offered “evidence” by repeating all the pages. So now we’re arguing over what is real. Funny, but frustrating.
Any ideas as to what I might try to prevent this? By the way, the results were very impressive. All the names sounded completely authentic!
Yes, I know. My question has to do with ways to eliminate hallucination, which I prefer to call confabulation because it means making stuff up. Not that I have anything against saying hallucination; after all, it’s a fun way of anthropomorphizing the AI.
At the outset, there are several prompt issues here. The biggest one is ambiguity: “good” sources by what definition?
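Another issue is the Temperature setting. In the OpenAI API, temperature ranges from 0 to 2, and higher values add randomness, which is the opposite of what you want when asking for factual lists (a value of 5 would simply be rejected). Here is a minimal sketch of a low-temperature request using the official openai Python client; the model name and prompt are placeholders, not a recommendation:

```python
# Sketch only: a low-temperature request with the official openai client
# (pip install openai). Model name and prompt are placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{
        "role": "user",
        "content": "List organizations that publish ethical guidelines for "
                    "residential addiction treatment providers in Mexico.",
    }],
    temperature=0,  # API accepts 0-2; 0 is the most deterministic setting
)
print(response.choices[0].message.content)
```

Low temperature makes the output more repeatable, but it does not make the model truthful; it only stops the sampling step from adding extra variety on top of whatever the model already believes.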
GPT models always have a training cutoff date. How can you expect the model to give you good sources if your definition of good includes providers of residential addiction treatment that are actually in business at this moment?
Your premise for using AI in this manner is flawed; you are expecting the model to be a real-time representation of the web, and it is nothing of the sort.
What you might be able to do is use GPT to tell you where to dig for that information and then use subprocesses that carry out the detailed tasks of acquiring the data. I recommend you take a look at Bardeen.ai, which can use GPT to locate data sources and then scrape them into structured lists.
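If you want to roll that pipeline yourself rather than use a tool, the key step is verifying the model’s suggestions before trusting them. A rough sketch, with placeholder URLs standing in for whatever the model returns (this is plain Python with the requests library, not Bardeen’s API):

```python
# Sketch of a verification pass over model-suggested sources.
# The URLs below are stand-ins for whatever the model returns;
# requests is a third-party library (pip install requests).
import requests

candidate_urls = [
    "https://example.org/guidelines",       # placeholder for a suggested link
    "https://example.com/does-not-exist",   # placeholder for a fabricated one
]

def url_resolves(url: str, timeout: float = 10.0) -> bool:
    """Return True only if the URL answers with a successful status."""
    try:
        resp = requests.get(url, timeout=timeout, allow_redirects=True)
        return resp.ok  # any 2xx status after redirects
    except requests.RequestException:
        return False  # DNS failure, timeout, refused connection, etc.

verified = [u for u in candidate_urls if url_resolves(u)]
print(f"{len(verified)} of {len(candidate_urls)} suggested sources resolve")
```

A URL that resolves still is not proof the organization is legitimate, but a pass like this immediately weeds out the fully invented ones, which is exactly the failure mode you ran into.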