Hi! I am trying to build a chatbot with LangChain and OpenAI. I am new to coding, so it is very much trial and error.
Yesterday my code was running perfectly fine in my Colab notebook. Now I get an error saying I reached the maximum context length.
The error comes when running:
from langchain.chains.combine_documents import create_stuff_documents_chain
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder

question_answering_prompt = ChatPromptTemplate.from_messages(
    [
        (
            "system",
            "Answer the user's questions based on the below context:\n\n{context}",
        ),
        MessagesPlaceholder(variable_name="messages"),
    ]
)

document_chain = create_stuff_documents_chain(model, question_answering_prompt)

from langchain.memory import ChatMessageHistory

chat_history = ChatMessageHistory()
chat_history.add_user_message("Hvad er Salling?")

document_chain.invoke(
    {
        "messages": chat_history.messages,
        "context": texts,
    }
)
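For context, model and texts come from earlier cells of the notebook. Roughly, the setup looks like this (the model name, source file, and splitter settings below are placeholders for what I actually use):

# Rough sketch of the earlier cells -- model name, source file, and
# splitter settings are placeholders, not my exact code.
from langchain_openai import ChatOpenAI
from langchain_text_splitters import RecursiveCharacterTextSplitter
from langchain_core.documents import Document

model = ChatOpenAI(model="gpt-4")  # placeholder model name

raw_text = open("salling.txt", encoding="utf-8").read()  # placeholder source document
splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100)
texts = [Document(page_content=chunk) for chunk in splitter.split_text(raw_text)]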
The question I am asking is very simple and should not require a long answer.
I get the same error across models. I suspect it is because this morning I re-ran the entire code very quickly in my Colab notebook, and maybe that overwhelms the model? But as I am new to this, I guess it could be many things.
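In case it helps, here is a small sketch I can use to roughly count the tokens going into the prompt with tiktoken (assuming texts holds LangChain Document objects or plain strings, and using "gpt-4" as a placeholder for whichever model is actually used):

# Rough token count with tiktoken (pip install tiktoken).
import tiktoken

encoding = tiktoken.encoding_for_model("gpt-4")  # placeholder model name

# Join the chunks that get stuffed into {context}; the getattr fallback
# covers both Document objects and plain strings.
context_text = "\n\n".join(getattr(t, "page_content", str(t)) for t in texts)

context_tokens = len(encoding.encode(context_text))
message_tokens = sum(len(encoding.encode(m.content)) for m in chat_history.messages)

print("context tokens:", context_tokens)
print("message tokens:", message_tokens)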
This is my error message:
BadRequestError: Error code: 400 - {'error': {'message': "This model's maximum context length is 8192 tokens. However, you requested 9850 tokens (1750 in the messages, 8100 in the completion). Please reduce the length of the messages or completion.", 'type': 'invalid_request_error', 'param': 'messages', 'code': 'context_length_exceeded'}}