let template = "You are a chatbot for e-commerce. Be kind, detailed and nice. " +
"Present the given queried search result in a nice way as an answer to the user input. " +
"The products are fruits, and the prices are in EGP. " +
"Don't ask questions back! Just take the given context:" +
"{chat_history} " +
"Human: {question} " +
"Chatbot:";
llm = new ChatOpenAI({
modelName: process.env.OPENAI_MODEL,
openAIApiKey: process.env.OPENAI_API_KEY,
temperature: 0,
maxTokens: 150
});
const chain = ConversationalRetrievalQAChain.fromLLM(llm, retriever, {
prompt: PromptTemplate.fromTemplate(template),
memory: new ConversationSummaryMemory({ memoryKey: "chat_history", llm: llm, })
})
const res = await chain.call({question: user_message})
First of all, if this is your full code, you need an API key. That said, you presumably already have an OpenAI API key set for this, so that is probably not the issue.
One of the reasons this might not be working, in my mind, is all the +'s concatenating the template strings at the end of every line. Combining them into a single template literal usually works better (see the last snippet below).
I can't say for sure how to fix this, because I don't know the actual error.
If you're in plain .js, it could also be that something you're passing in isn't actually an array.
If none of these help, just post the error message (censor your API key if it shows up in it) so I can take a look.
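For example, here is a minimal sketch of how you could capture the error, assuming the chain and user_message from your snippet; it just strips the key out of the message before you paste it anywhere:

// Wrap the existing call so the error can be logged without leaking the key.
try {
  const res = await chain.call({ question: user_message });
  console.log(res.text);
} catch (err) {
  const key = process.env.OPENAI_API_KEY;
  const message = String(err && err.message ? err.message : err);
  console.error(key ? message.replaceAll(key, "[REDACTED]") : message);
}

And here is roughly how I would lay out the whole flow, loading the products file, building the FAISS index, and only then wiring up the retriever, memory, and chain: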
let template = "You are a chatbot for e-commerce. Be kind, detailed and nice. " +
"Present the given queried search result in a nice way as an answer to the user input. " +
"The products are fruits, and the prices are in EGP. " +
"Provide the fruits in a dot list. " +
"Don't ask questions back! Just take the given context:" +
"{chat_history} " +
"Human: {question} " +
"Chatbot:";
const llm = new ChatOpenAI({
  modelName: process.env.OPENAI_MODEL,
  openAIApiKey: process.env.OPENAI_API_KEY,
  temperature: 0,
  maxTokens: 150
});

// Await each step directly (load, embed, save) so `retriever` definitely exists
// before the chain is built.
const loader = new TextLoader(`./products.json`);
const docs = await loader.load();
const vectorStore = await FaissStore.fromDocuments(docs, new OpenAIEmbeddings());
await vectorStore.save(`db`);
const retriever = vectorStore.asRetriever();

// Build the conversational retrieval chain, then query it with the user's message
const chain = ConversationalRetrievalQAChain.fromLLM(llm, retriever, {
  prompt: PromptTemplate.fromTemplate(template),
  memory: new ConversationSummaryMemory({
    memoryKey: "chat_history",
    llm: llm,
  }),
});
const res = await chain.call({ question: user_message });
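One more side note: since the snippet saves the index to `db`, on later runs you could load it back instead of re-embedding the whole products file every time. A sketch, assuming the same LangChain version (the variable names are just illustrative):

// Reload the FAISS index that was saved with vectorStore.save(`db`) above.
const savedStore = await FaissStore.load(`db`, new OpenAIEmbeddings());
const retrieverFromDisk = savedStore.asRetriever();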
As mentioned above, writing the whole template as a single template literal could also help limit coding and typing errors before the prompt is sent off to the API:
let template = `You are a chatbot for e-commerce. Be kind, detailed and nice.
Present the given queried search result in a nice way as an answer to the user input.
The products are fruits, and the prices are in EGP.
Provide the fruits in a dot list.
Don't ask questions back! Just take the given context:
{chat_history}
Human: {question}
Chatbot:`;
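Note that the placeholders stay as single braces here: PromptTemplate fills in {question} and the memory fills in {chat_history}, rather than JavaScript's ${} interpolation (which would look for chat_history and question variables that don't exist). If you want to sanity-check the template on its own, something like this should work (the values are made up):

// Render the prompt with dummy values to confirm the placeholders resolve.
const prompt = PromptTemplate.fromTemplate(template);
const rendered = await prompt.format({
  chat_history: "Human: hi\nAI: hello!",
  question: "How much are the apples?",
});
console.log(rendered);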