Hi, I’m building a chatbot that answers questions about school law. It has two problems:
- It doesn’t seem to use memory correctly: when I repeat my previous question, it just returns its previous answer.
- It doesn’t always give correct answers.
I’m using ConversationalRetrievalChain with a Chroma retriever. Can you tell me where I’m going wrong? I’m still new to this.
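For context, the retriever comes from a Chroma vector store built roughly like this (a simplified sketch; docs, the embedding model and the persist directory are just placeholders for my real setup):

from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import Chroma

# docs = my chunked school-law documents (placeholder)
vectordb = Chroma.from_documents(documents=docs, embedding=OpenAIEmbeddings(), persist_directory="db")
retriever = vectordb.as_retriever(search_kwargs={"k": 4})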
This is how my prompt template looks:
from langchain.prompts import PromptTemplate

prompt_template1 = """Answer the question as accurately as possible from the context below.
{context}
Question: {question}
Chat History:
{chat_history}
Answer in Polish:"""

PROMPT1 = PromptTemplate(
    template=prompt_template1,
    input_variables=["context", "question", "chat_history"],
)
chain_type_kwargs = {"prompt": PROMPT1}
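Just to sanity-check the template, I format it by hand like this and it fills in all three variables without errors (the values are dummies):

print(PROMPT1.format(
    context="(retrieved fragments)",
    question="(current question)",
    chat_history="(previous turns)",
))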
And here is how I set up the ConversationalRetrievalChain:
from langchain.chains import ConversationalRetrievalChain
from langchain.memory import ConversationBufferMemory

# output_key='answer' so the memory knows which output field to store
memory = ConversationBufferMemory(memory_key='chat_history', return_messages=True, output_key='answer')

qa_memory = ConversationalRetrievalChain.from_llm(
    llm=llm,
    memory=memory,
    chain_type='stuff',
    retriever=retriever,
    combine_docs_chain_kwargs=chain_type_kwargs
)
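Then I call the chain like this (query is just the user's question string):

result = qa_memory({"question": query})
print(result["answer"])

Since the memory is attached to the chain, I only pass "question" and expect the chat history to come from memory.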
Is this written correctly, or should I change something? I’m also wondering how I can give the model more hints, since combine_docs_chain_kwargs=chain_type_kwargs only seems to let me pass a single key-value pair.
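For example, I don’t know whether I’m even allowed to put a second key next to "prompt" like this (document_prompt is just my guess, I haven’t confirmed the 'stuff' chain accepts it):

# Guess: format each retrieved chunk with its own mini-prompt
document_prompt = PromptTemplate(
    template="Fragment: {page_content}",
    input_variables=["page_content"],
)
chain_type_kwargs = {"prompt": PROMPT1, "document_prompt": document_prompt}

Would something like that be the right way to add more instructions, or is there a better place to put them?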