GPT-3 Question Answering

Hey there,
Regarding GPT-3 question answering:
We used a GPT-3 question-answering model with a Wikipedia link as input, but the model gives out-of-context answers. Is there any way to avoid this?
For example, if we provide a context about the American economy and ask the model "What is the capital of India?",
it answers "Delhi", but we expect "query not found".

If you are looking to use Wikipedia to provide context to the Q&A endpoint, you should use the Wikipedia API: send the user's input to Wikipedia, get back summaries of the top 2-3 articles, then concatenate them and feed that concatenation to GPT-3. I did this with my question-answering program and it works quite well, although you need to make sure articles actually come back, because some very specific questions return irrelevant articles or none at all. This is just my current solution to using wiki articles as context, but there could very well be a better one.
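A minimal sketch of that flow, assuming the MediaWiki search/extracts API; the helper names and the prompt wording are my own, and the actual GPT-3 call is left out:

```python
import json
import urllib.parse
import urllib.request

def wikipedia_summaries(query, limit=3):
    """Fetch intro summaries of the top `limit` Wikipedia articles for `query`."""
    params = urllib.parse.urlencode({
        "action": "query", "format": "json", "generator": "search",
        "gsrsearch": query, "gsrlimit": limit,
        "prop": "extracts", "exintro": True, "explaintext": True,
    })
    url = "https://en.wikipedia.org/w/api.php?" + params
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    pages = data.get("query", {}).get("pages", {})
    return [p.get("extract", "") for p in pages.values()]

def build_prompt(question, summaries):
    """Concatenate the article summaries into a single context block."""
    context = "\n\n".join(s for s in summaries if s)
    return (
        "Answer the question using only the context below. "
        'If the answer is not in the context, reply "query not found".\n\n'
        f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
    )
```

The resulting prompt string would then be sent to the completions endpoint; if the search returns no articles, `build_prompt` gets an empty context and you can short-circuit before calling GPT-3 at all.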


I do not think there is any sure-fire way.

I would consider breaking it down into two tasks:

  1. Extract the information that answers the question; OR abort if there is none
  2. Reformulate the extracted information to the desired format.

This should limit the model’s ability to recall, infer, or dream up answers.
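The two-step approach could be sketched like this; the prompt templates and the "NONE" sentinel are illustrative assumptions, and `complete` stands in for whatever GPT-3 completion call you use:

```python
# Step 1: extract a supporting quote, or NONE if the passage has no answer.
EXTRACT_TEMPLATE = (
    "Quote the sentence from the passage that answers the question, "
    "or reply NONE if the passage does not answer it.\n\n"
    "Passage: {context}\nQuestion: {question}\nQuote:"
)

# Step 2: reformulate the extracted quote into the desired answer format.
REFORMULATE_TEMPLATE = (
    "Rewrite the quoted sentence as a direct answer to the question.\n\n"
    "Question: {question}\nQuote: {quote}\nAnswer:"
)

def answer(question, context, complete):
    """`complete` is any prompt -> text function, e.g. a GPT-3 API call."""
    quote = complete(
        EXTRACT_TEMPLATE.format(context=context, question=question)
    ).strip()
    if quote.upper().startswith("NONE"):
        return "query not found"
    return complete(
        REFORMULATE_TEMPLATE.format(question=question, quote=quote)
    ).strip()
```

Keeping the extraction step grounded in a literal quote is what limits recall from the model's own knowledge; the second step only ever sees text that came from the context.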

However, at the end of the day, you cannot fundamentally separate the model’s understanding of the world from its understanding of the language. So if it is important that the application only draws answers from contexts, I think you should evaluate it against a dataset.
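One hedged way to run that evaluation, assuming a dataset of out-of-context questions (the dataset format and the "query not found" refusal string are assumptions, not part of any endpoint):

```python
def refusal_rate(system, dataset):
    """Fraction of out-of-context (question, context) pairs the system refuses.

    `system` is any (question, context) -> answer function; `dataset` holds
    only pairs whose context does NOT contain the answer, so the ideal
    refusal rate is 1.0.
    """
    refused = sum(1 for q, c in dataset if system(q, c) == "query not found")
    return refused / len(dataset)
```

Measuring this on even a few dozen hand-written pairs gives a concrete number to compare prompt variants against, rather than judging from single examples.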


This is nice, @cwenner. Curious to know how it works with the updated endpoints.