I am having the typical problem of a custom GPT hallucinating even when given sources and materials to use. It would be great to have a toggle that would say “force GPT to use retrieval and/or web search” to answer questions. That could reduce hallucinations, right?
If you specify in the ‘Instructions’ that your GPT needs to use the browser tool, it will do so. But the instruction needs to be well placed in the prompt to have the desired effect.
I agree. I didn’t explicitly tell mine to give me links to webpages, but I asked it to provide specific examples and assumed it would provide links to the pages it retrieved the information from.
Is this working for you?
Are your custom instructions still working?
I have an assistant that I want to reference five questions from a very simple file.
I clearly tell it in the instructions to only ask questions from the file.
In about 50% of conversations it asks a question that is not in the file.
I am super frustrated!
In this case, you can try creating steps for the model to follow, such as:
Step 1) Create five questions from the attached file.
Step 2) Verify, step by step, that all five questions were constructed solely from the file content.
Step 3) Redo any questions that did not come from the file, using data from the file.
Step 4) Show the final five questions created from the file.
This approach of asking the model to do something and then check its own work tends to produce better results.
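The verification step can also be enforced outside the model. A minimal sketch (with stand-in data and hypothetical helper names — your real setup would parse the allowed questions from the attached file) that checks each generated question against the file content and flags the ones to redo:

```python
# Stand-in for the questions parsed from the attached file.
ALLOWED = {
    "What is the capital of France?",
    "Who wrote Hamlet?",
}

def verify_questions(candidates, allowed):
    """Split candidate questions into (grounded, hallucinated) lists,
    comparing case- and whitespace-insensitively."""
    normalized = {q.strip().lower() for q in allowed}
    grounded = [q for q in candidates if q.strip().lower() in normalized]
    hallucinated = [q for q in candidates if q.strip().lower() not in normalized]
    return grounded, hallucinated

# Example: one question from the file, one hallucinated.
candidates = [
    "What is the capital of France?",
    "What is the tallest mountain?",  # not in the file -> flag for redo
]
good, bad = verify_questions(candidates, ALLOWED)
```

Here `good` keeps the grounded question and `bad` collects the ones to send back to the model for Step 3; exact-match checking is simplistic, but for a short, fixed question file it catches drift reliably.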