AI still hallucinating with RAG. Any idea how to fix it?

Hello. I have a problem with an AI assistant that uses RAG. If the question is specific, like “how do I buy item XYZ in your store,” and the knowledge base returns only general instructions on purchasing any item, the assistant assumes item XYZ exists in the store, even if it doesn’t. I’ve added so many things to my prompt to fix this, including very specific examples. Still, it assumes item XYZ exists, says “here’s how you buy XYZ…,” and continues with the general instructions.

Has anyone else dealt with this issue before? How did you solve it?

Sounds like your system prompt might be a bit long and/or complicated.

Any way you can share?

How much data did you upload? All in one file? Separate files?

I would first look at how to analyze the user intent, break it into “solution” steps, execute the steps, then answer.

Here it would be something like:

The user wants to buy XYZ in our store.

# A runnable Python sketch of that flow; the INVENTORY dict below is just a
# stand-in for your real stock / product-info APIs.
INVENTORY = {
    "XYZ-001": {
        "in_stock": True,
        "instructions": "Use the one-click buy button on the product page.",
        "url": "https://shop.example/xyz-001",
    },
}

def handle_purchase_request(item: str) -> str:
    # Step 1: Check if the item is part of the usual inventory (SKU check)
    product = INVENTORY.get(item)
    if product is None:
        # SKU does not exist; ask for clarification and/or suggest equivalents
        return (f"I can't find '{item}' in our catalog. "
                f"Did you mean one of: {', '.join(INVENTORY)}?")
    # Step 2: Confirm item availability
    if not product["in_stock"]:
        # Item is not available; offer pre-order or a back-in-stock alert
        return f"'{item}' is out of stock. Would you like to pre-order or be notified?"
    # Step 3: Provide purchase instructions if available
    if product.get("instructions"):
        return f"{product['instructions']} Product page: {product['url']}"
    # Otherwise show the default buying process (add to cart / checkout)
    return f"Add it to your cart and check out: {product['url']}"

# Final step: offer additional assistance for any further questions

It would be good to think the whole process through, with clearly defined responsibilities for the bot.

Ideally,

  1. you define the API to interact with the stock system, website content, and the RAG engine,
  2. then you specify the actions the assistant can perform (and what it should do when an action is not allowed or does not exist),
  3. and you add a user-intent analysis tool plus a planner (it helps to connect the planner to the RAG engine so it can look up how issues/inquiries are solved by default). A rough sketch of points 1 and 2 follows below.
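
For points 1 and 2, something like this could work, assuming a function-calling style setup; all action names here (`check_stock`, `get_page`, `search_kb`) are hypothetical placeholders for your real APIs:

```python
# Hypothetical action registry; each entry declares what the assistant may do.
# The action names are placeholders, not a real API.
ACTIONS = {
    "check_stock": {
        "description": "Look up a SKU and return its availability.",
        "parameters": {"sku": "string"},
    },
    "get_page": {
        "description": "Fetch website content, e.g. a product page.",
        "parameters": {"url": "string"},
    },
    "search_kb": {
        "description": "Query the RAG engine for support articles.",
        "parameters": {"query": "string"},
    },
}

def dispatch(action: str, **kwargs) -> str:
    # Point 2's failure case: return a fixed refusal for unknown or
    # disallowed actions instead of letting the model improvise one.
    if action not in ACTIONS:
        return ("I can't do that. I can check stock, fetch product pages, "
                "or search the knowledge base.")
    raise NotImplementedError("wire the real API call in here")
```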

This is a very specific solution, but I need something that applies to many situations where the user gives the AI specific but incorrect information (“How do I buy item XYZ?”) and the RAG information is general (how to buy any item). The AI then presents the general information as if it were specific (“Here’s how you buy XYZ…”), which misleads the user.
I’m just not sure how to solve this without writing a detailed procedure for every case.
Even when I say in the prompt, “if item XYZ is not listed in the information by name, then say you only have general information,” it does not listen… :frowning:
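
One way to stop relying on the prompt alone is a grounding check in application code: before the model answers, verify that the item the user named actually appears in the retrieved chunks, and if it doesn’t, inject the caveat yourself. A minimal sketch, assuming the item name has already been extracted (by a cheap classifier call or by matching against your SKU list):

```python
def entity_is_grounded(entity: str, chunks: list[str]) -> bool:
    """True only if the item is literally mentioned in the retrieved context."""
    needle = entity.lower()
    return any(needle in chunk.lower() for chunk in chunks)

def build_answer_prompt(question: str, entity: str, chunks: list[str]) -> str:
    context = "\n\n".join(chunks)
    if not entity_is_grounded(entity, chunks):
        # Force the caveat at the application layer instead of hoping the
        # model obeys an instruction buried in the system prompt.
        return (
            f"Context:\n{context}\n\n"
            f"The context does NOT mention '{entity}'. Tell the user you only "
            f"have general purchasing information and cannot confirm that "
            f"'{entity}' is sold here. Then answer: {question}"
        )
    return f"Context:\n{context}\n\nAnswer the question: {question}"
```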

Write it as general guidance on how to handle user requests about a specific product, using placeholders for product fields (like {product.name}). Or better, write detailed instructions for a specialized product-inquiries assistant to whom you delegate all product questions.
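
For example, a minimal version of such a template might look like the following; the {product.*} fields and the wording are placeholders to adapt:

```python
# Sketch of a product-inquiry instruction template; the {product.*} fields
# are filled from the catalog before the model sees the prompt.
PRODUCT_GUIDANCE = """
When the user asks about a specific product:
1. Confirm {product.name} exists in the catalog (SKU {product.sku}).
   If it does not, say so and suggest close matches. Do NOT give
   purchase steps for a product you cannot confirm exists.
2. If it exists and is in stock, give the purchase steps and link
   to {product.url}.
3. If it exists but is out of stock, offer pre-order or a
   back-in-stock notification.
""".strip()
```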

Or you can push it even further with “department managers”: a “stock manager”, a “product info manager”, a sales team, etc.
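
A sketch of what that delegation could look like: a router classifies the inquiry and hands it to a specialist with its own narrow system prompt. The roles, prompts, and keyword routing here are illustrative only; in practice the routing would itself be a cheap model call:

```python
# Illustrative "department manager" setup; role names and prompts are
# made up for the example, not a prescribed architecture.
SPECIALISTS = {
    "stock": ("You are the stock manager. Answer only availability questions, "
              "and never confirm SKUs that are not in the provided stock data."),
    "product_info": ("You are the product info manager. Answer only from the "
                     "provided product pages; say so when a product is missing."),
    "sales": "You are the sales team. Handle pricing, discounts, and checkout.",
}

def route(inquiry: str) -> str:
    # Keyword routing keeps the sketch self-contained; a classifier or a
    # small model call would do this in a real system.
    text = inquiry.lower()
    if any(w in text for w in ("in stock", "available", "availability")):
        return "stock"
    if any(w in text for w in ("price", "discount", "checkout")):
        return "sales"
    return "product_info"

def system_prompt_for(inquiry: str) -> str:
    return SPECIALISTS[route(inquiry)]
```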