Is there any alternative method to implement follow-up questions based on memory history using LangChain and OpenAI?

```python
from langchain.memory import ChatMessageHistory

history = ChatMessageHistory()

# First question is asked with an (empty) history
question = "How many customers with order count more than 5"
response = chain.invoke({"question": question, "messages": history.messages})

# Store the turn so it can be sent along with the next call
history.add_user_message(question)
history.add_ai_message(response)
history.messages

# Follow-up question, passing the stored history back in
response = chain.invoke({"question": "Can you list their names?", "messages": history.messages})
response
```

But this code is not working when I implement it with our custom database tables.

Is there any other alternative method available to implement follow-up questions based on history using LangChain and an OpenAI model?
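
For example, would wrapping the chain in `RunnableWithMessageHistory` be the right direction? A rough, untested sketch of what I mean is below; `chain` is my existing chain, I'm assuming its prompt has a "messages" placeholder for the history, and the session store is just a stand-in:

```python
from langchain.memory import ChatMessageHistory
from langchain_core.runnables.history import RunnableWithMessageHistory

# Stand-in session store: one ChatMessageHistory per session id
store = {}

def get_session_history(session_id: str) -> ChatMessageHistory:
    if session_id not in store:
        store[session_id] = ChatMessageHistory()
    return store[session_id]

# Wrap the existing chain; the history is read before each call
# and the new turn is appended after it
chain_with_history = RunnableWithMessageHistory(
    chain,
    get_session_history,
    input_messages_key="question",
    history_messages_key="messages",
)

response = chain_with_history.invoke(
    {"question": "How many customers with order count more than 5?"},
    config={"configurable": {"session_id": "demo-session"}},
)

follow_up = chain_with_history.invoke(
    {"question": "Can you list their names?"},
    config={"configurable": {"session_id": "demo-session"}},
)
```

With this I shouldn't need the manual `add_user_message` / `add_ai_message` calls, if I understand the docs correctly?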

Are you getting an error? Or just not the results you expect?

Can you give a bit more detail on what you're trying to achieve?

Basically, whatever you send on each call will be in the context.
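
One thing worth checking: the messages you pass in only reach the model if the prompt actually has a slot for them. A minimal sketch of what I mean, assuming your chain is a prompt piped into an OpenAI chat model (the system text and model name here are just placeholders):

```python
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_openai import ChatOpenAI

# The key point: "messages" must appear in the prompt, otherwise the
# history you pass alongside the question never reaches the model.
prompt = ChatPromptTemplate.from_messages([
    ("system", "You answer questions about the customer and order tables."),
    MessagesPlaceholder(variable_name="messages"),  # prior turns go here
    ("human", "{question}"),
])

chain = prompt | ChatOpenAI(model="gpt-4o-mini")
```

If the prompt only contains `{question}`, I believe the extra `messages` key is just ignored, which would explain the behaviour you're describing.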

Just not getting the results I am expecting.

Whenever I ask a follow-up to a previous question that was added to the history, it doesn't take that previous question into account as input for the follow-up, so I don't get the expected results.