How to add context awareness for follow-up questions in a chatbot

I am trying to put together a Q&A chatbot which can remember the “context” of the previous chat exchanges with a user, to provide relevant answers. Example:

User: Can you show me some Nike shoe models?
AI: Here you go! <Presents shoes 1, 2 and 3>
User: Cool. I like 1. How much does it cost?
AI: Shows the cost of Shoe 1
User: What colors does it come in?
How can I do that in Python?

Welcome to the forum…

So, you have your System message, then a User message, then the Assistant response, then another User message, then the Assistant response, and so on…

import openai

# Initialize OpenAI API
openai.api_key = 'YOUR_OPENAI_API_KEY'

# Initial system message to set the behavior of the assistant
messages = [{"role": "system", "content": "You are a helpful assistant."}]

def chatbot_response(user_input):
    # Append user's message to messages list
    messages.append({"role": "user", "content": user_input})
    
    # Make the API call
    response = openai.ChatCompletion.create(
      model="gpt-4.0-turbo",
      messages=messages
    )
    
    # Extract the response text
    output_text = response.choices[0].message['content']
    
    # Append assistant's message to messages list for context
    messages.append({"role": "assistant", "content": output_text})
    
    return output_text

# Testing the chatbot
user_input = input("You: ")
while user_input.lower() != 'exit':
    response = chatbot_response(user_input)
    print(f"AI: {response}")
    user_input = input("You: ")

In this example:

  1. The conversation starts with a system instruction, setting the behavior for the assistant.
  2. Each user message is appended to the messages list, which maintains the context.
  3. After obtaining the assistant’s response, we also append that to the messages list to maintain the ongoing conversation context.
  4. This allows for back-and-forths where the user can refer back to previous messages, and the model will provide contextually relevant responses.

Keep in mind that if the conversation gets very long, you might exceed the model’s token limit. If that happens, you’ll need to trim or manage the messages in a way that keeps the most relevant context without exceeding the limit.
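One way to sketch that trimming: keep the system message and drop the oldest exchanges until the history fits. The token count below is only a rough estimate (about 4 characters per token); in practice you'd want a real tokenizer such as tiktoken, and the 3000-token budget is an arbitrary placeholder.

```python
def estimate_tokens(message):
    """Very rough token estimate: ~4 characters per token."""
    return max(1, len(message["content"]) // 4)

def trim_messages(messages, max_tokens=3000):
    """Keep the system message, drop the oldest non-system messages
    until the estimated total fits within max_tokens."""
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    while rest and sum(estimate_tokens(m) for m in system + rest) > max_tokens:
        rest.pop(0)  # drop the oldest user/assistant message first
    return system + rest
```

You would call `trim_messages(messages)` just before each API call, so the request never grows past the budget while the system instruction always survives.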


I am creating my own custom chatbot using TensorFlow. My chatbot responds well, but it lacks context awareness for follow-up questions. That's what I want to implement in my chat. Please guide me through it if you or anyone knows how to do it.

You just have to append the previous user and assistant messages in order. Super easy.

Another way to do it is to use the “standalone question”. This is how it was explained to me: Chat Completion Architechture - #2 by AgusPG

I liked it so much that I even did a video on it: https://youtu.be/B5B4fF95J9s
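The idea is to make one extra model call that rewrites the follow-up into a question that needs no prior context, so the final answer can be generated without sending the whole history. A minimal sketch of building that rewrite prompt (the exact prompt wording here is my own assumption, not the one from the linked post):

```python
def build_condense_prompt(history, follow_up):
    """Turn (user, assistant) exchange pairs plus a follow-up question
    into a prompt asking the model for a standalone rewrite."""
    lines = []
    for user_msg, ai_msg in history:
        lines.append(f"User: {user_msg}")
        lines.append(f"AI: {ai_msg}")
    transcript = "\n".join(lines)
    return (
        "Given the conversation below, rewrite the follow-up question "
        "as a standalone question that needs no prior context.\n\n"
        f"{transcript}\n\n"
        f"Follow-up: {follow_up}\n"
        "Standalone question:"
    )
```

For the shoe example, "What colors does it come in?" would come back as something like "What colors does Nike Shoe 1 come in?", which you can then answer with a fresh, short API call.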