Dealing with context switching in a conversation that uses embeddings for information retrieval

It’s been a lot of fun to build a chatbot that leverages embeddings to retrieve factually correct information from a knowledge base using GPT-Index and LangChain, and then serve that up to the user via the Completion API.

What I am struggling with is finding a natural way to handle context switching by the user without resetting the whole conversation, i.e. the user asks a question about a new topic while the bot was answering questions about another one.

In a perfect world, the bot would understand when it’s time to do a new embeddings search, but I can’t figure out how to do that. I think I could just hard-code some keywords that instruct the bot to do a new search and explain to the user how to use them, e.g. “I have a New Question”.
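For reference, here is a rough sketch of that hard-coded trigger idea. The trigger phrases and the `search_knowledge_base` helper are hypothetical placeholders for whatever retrieval call the bot already makes, not GPT-Index or LangChain APIs:

```python
# Sketch of the hard-coded trigger idea. NEW_TOPIC_TRIGGERS and
# search_knowledge_base() are hypothetical placeholders, not library APIs.
NEW_TOPIC_TRIGGERS = ["i have a new question", "new topic", "different question"]


def search_knowledge_base(query: str, top_k: int = 3) -> list[str]:
    """Placeholder for the embeddings search against the knowledge base."""
    raise NotImplementedError


def context_for_turn(message: str, current_context: list[str]) -> list[str]:
    """Refresh the retrieved context only when the user signals a new topic."""
    if any(trigger in message.lower() for trigger in NEW_TOPIC_TRIGGERS):
        # User explicitly asked for a new topic: run a fresh embeddings search.
        return search_knowledge_base(message)
    return current_context
```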

A variation of the question above: what if the user wants more or deeper information about the same topic, or a more extensive version of the answer? Should I just preload the prompt with more context (embeddings search results)?
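For that deeper-answer case, one option (just a sketch, not a tested recipe) is to re-run the same retrieval with a larger top-k so the prompt gets preloaded with more chunks. This reuses the hypothetical `search_knowledge_base` helper from the sketch above; `top_k` stands in for however your index limits the number of results:

```python
def deeper_context(query: str, deeper: bool = False) -> list[str]:
    """Pull more supporting chunks when the user asks for a more extensive answer.

    Reuses the hypothetical search_knowledge_base() helper from the sketch
    above; top_k stands in for whatever knob your index exposes for the
    number of returned results.
    """
    top_k = 8 if deeper else 3  # more chunks for "tell me more" style requests
    return search_knowledge_base(query, top_k=top_k)
```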

Thanks for the help!


Maybe play with a prompt that takes in the user input and grades whether it’s time to switch topics or not. Give it a few-shot example. Sending it to Curie or even Babbage might work, and those are fast and cheap…
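Roughly something like this, assuming the legacy `openai` Python client and the `text-curie-001` completions model; the few-shot examples are made up and would need tuning for your domain:

```python
import openai

# Few-shot prompt that grades whether the new user message stays on the
# current topic or switches to a new one. The examples are illustrative only.
CLASSIFIER_PROMPT = """Decide whether the new user message continues the current topic or switches to a new one. Answer with SAME or NEW.

Current topic: resetting a password
User message: What if I never get the reset email?
Answer: SAME

Current topic: resetting a password
User message: How do I cancel my subscription?
Answer: NEW

Current topic: {topic}
User message: {message}
Answer:"""


def is_topic_switch(topic: str, message: str) -> bool:
    """Return True when the small model thinks the user changed topics."""
    response = openai.Completion.create(
        model="text-curie-001",  # Curie/Babbage: fast and cheap for this
        prompt=CLASSIFIER_PROMPT.format(topic=topic, message=message),
        max_tokens=3,
        temperature=0,
    )
    return response["choices"][0]["text"].strip().upper().startswith("NEW")
```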


Thank you, I will experiment with that.

I was assuming this would be a pretty common problem for chatbots, so I was hoping someone had already solved it :).

The only workaround I found is to show the anticipated topic (intent) above the chat together with a button to reset it. Resetting simply removes the chat history.
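In code it amounts to little more than this kind of session state (the class and field names are just illustrative, not from any library):

```python
from dataclasses import dataclass, field


@dataclass
class ChatSession:
    """Hypothetical session state: the detected topic shown above the chat
    plus the running message history sent along with each completion."""
    topic: str = ""
    history: list[str] = field(default_factory=list)

    def reset(self) -> None:
        # The reset button clears the history and the displayed topic,
        # forcing a fresh embeddings search on the next user message.
        self.topic = ""
        self.history.clear()
```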