I have the same problem; I find it very difficult to force the model to use only the context. Here is my prompt:
"""
Please read the context provided below:
CONTEXT
{context_str}
Based solely on the information given in the context above, answer the following question. If the information isn’t available in the context to formulate an answer, simply reply with ‘NO_ANSWER’. Please do not provide additional explanations or information.
Question: {query_str}"""
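For reference, here is roughly how I wire this template up. This is just a minimal sketch using the OpenAI Python client (v1+) and plain string formatting; the `answer` function and variable names are placeholders, and `context_str` / `query_str` would come from your own retrieval step.

```python
# Minimal sketch: format the prompt and send it as a single user message.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

PROMPT_TEMPLATE = """Please read the context provided below:
CONTEXT
{context_str}
Based solely on the information given in the context above, answer the following question. If the information isn't available in the context to formulate an answer, simply reply with 'NO_ANSWER'. Please do not provide additional explanations or information.
Question: {query_str}"""

def answer(context_str: str, query_str: str, model: str = "gpt-3.5-turbo") -> str:
    prompt = PROMPT_TEMPLATE.format(context_str=context_str, query_str=query_str)
    response = client.chat.completions.create(
        model=model,
        temperature=0,  # a low temperature seems to help reduce off-context answers
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content
```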
So far I have found that:
1- gpt-3.5-turbo finds it very hard to stick to only the context
2- gpt-4 works well but is expensive
3- Google Vertex AI text-bison seems to work very well and the price is similar to gpt-3.5-turbo; the problem is that the responses tend to be shorter and not as friendly in tone.