How do I maintain context for a chatbot based on embeddings?

So I ask a question, the question is sent to the embeddings API, I search for similarity in the database, add the most similar embeddings to the context, and get a response back. Till now, all good.

Now, if I want to maintain context: let's say the question is "When was the American Revolution?" and I get the answer "The American Revolution began on April 19, 1775." The next question is "Why?", and the answer is "I apologize for the confusion in my previous response. Based on the given context, it does not explicitly state why the American Revolution began. Therefore, I don't know the exact reason for why it happened."

This is the code:

$result = $client->chat()->create([
    'model' => $completionModel,
    'messages' => [
        [
            'role' => 'user',
            'content' => $context,
        ],
    ],
    'temperature' => 0.7,
    'max_tokens' => $max_tokens,
]);
The messages array holds the 3 previous questions and answers: questions with role user, answers with role assistant.
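For reference, here is a minimal sketch (in Python, for illustration; the helper name and shape are my own, not from the post) of how that messages array is usually assembled: retrieved context in a system message, the last few Q&A pairs as alternating user/assistant turns, and the new question last.

```python
def build_messages(history, context, question, max_pairs=3):
    """Assemble a chat `messages` array: retrieved context in a system
    message, the last `max_pairs` Q&A pairs as user/assistant turns,
    and the new question at the end. Illustrative helper only."""
    messages = [{
        "role": "system",
        "content": "Answer from the Context or from the messages. "
                   "If you can't find the answer, say 'I don't know'.\n"
                   f"Context: {context}",
    }]
    for q, a in history[-max_pairs:]:
        messages.append({"role": "user", "content": q})
        messages.append({"role": "assistant", "content": a})
    messages.append({"role": "user", "content": question})
    return messages

msgs = build_messages(
    [("When was the American Revolution?",
      "The American Revolution began on April 19, 1775.")],
    "…retrieved embedding text…",
    "Why?")
# msgs[0] is the system prompt, msgs[-1] the new question
```

Structuring it this way keeps the retrieval prompt out of the conversational turns, so the model can still see the dialogue history separately from the injected context.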

$context = "Generate an answer from the Context or from the messages.
        If you can't find the answer, say 'I don't know'.
        Question: $ques
        Context: ";

Context is a prompt where I concatenate the embedding texts selected by similarity.
As I see it, the problem is how the "Why" in the second question is converted to an embedding: it is not very similar to anything, so the query doesn't find anything, no matter how many messages I pass to the chat. In other words, the messages would work if I didn't pass a custom context (that is, if I used ChatGPT's general knowledge).

Any thoughts on this are appreciated, thanks for reading.

Did you ever figure this out? I have a similar question.

Nope, unless I parse the previous questions and answers and somehow create a new question under the hood; too complicated and prone to failure.
