Chat Completion Architecture

I explained this idea of the stand-alone question here. Glad to see that it’s being used in other places! :slight_smile:

You need to “de-contextualize” the question in a conversational context to do a proper semantic search. Imagine the conversation:

  • User: What is the capital of Spain?
  • Assistant: It is Madrid.
  • User: How many people live there?

If you embed the question “How many people live there?” and run the semantic search on it, you won’t retrieve documents that talk specifically about the population of Madrid. That’s because it is a “contextual” question: it only makes sense within the ongoing conversation. You can solve this with a module that de-contextualizes the contextual question into a “stand-alone” one, something like “What is the population of Madrid?”

You can easily achieve this with an additional call to OpenAI. In my case, the “chat history” is composed only of the previous QA pairs: no documents. You don’t need the supporting documents to reformulate the contextual question into a stand-alone one: the previous utterances are enough. In fact, I only send the last three QA pairs, and that’s usually more than enough to produce the stand-alone question.
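
To make that concrete, here’s a rough sketch of what that extra call could look like with the openai Python client. The function name, prompt wording, and model are just illustrative assumptions; the only things taken from the above are that you send the last few QA pairs (no documents) and get back a stand-alone question:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment


def make_standalone_question(chat_history, question, model="gpt-3.5-turbo"):
    """Rewrite a contextual question into a stand-alone one.

    chat_history: list of (user_question, assistant_answer) pairs;
    only the last three are sent, no supporting documents.
    """
    # Flatten the last three QA pairs into a plain-text transcript
    history_text = "\n".join(
        f"User: {q}\nAssistant: {a}" for q, a in chat_history[-3:]
    )

    messages = [
        {
            "role": "system",
            "content": (
                "Rewrite the user's last question as a stand-alone question "
                "that can be understood without the conversation. "
                "Return only the rewritten question."
            ),
        },
        {
            "role": "user",
            "content": f"Conversation:\n{history_text}\n\nLast question: {question}",
        },
    ]

    response = client.chat.completions.create(
        model=model, messages=messages, temperature=0
    )
    return response.choices[0].message.content.strip()


# Example from the conversation above:
history = [("What is the capital of Spain?", "It is Madrid.")]
print(make_standalone_question(history, "How many people live there?"))
# -> something like "What is the population of Madrid?"
```

You then embed that stand-alone question for the semantic search instead of the raw user message.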

Hope it helps :slight_smile:
