Maximum context length problem with AgentExecutor

Hello, I have a problem with my agent. I loaded data into it using a VectorStoreRetriever; the data is the ‘Law on education and science’.
My memory is configured as AgentTokenBufferMemory(memory_key=memory_key, llm=ChatOpenAI(temperature=0, max_tokens=512), max_token_limit=2500), and after three or four questions I get this error:

InvalidRequestError: This model's maximum context length is 4097 tokens. However, your messages resulted in 4587 tokens (4541 in the messages, 46 in the functions). Please reduce the length of the messages or functions.

Is this a memory problem? How can I prevent this error from occurring?
I am using an AgentExecutor for this chatbot.


Hello, I hope you’re having a great day.

It appears the conversation context is growing beyond what your model can handle. The AgentExecutor keeps the full conversation history to give the model context for each new user input, but every model has a hard limit on how much context it will accept. Note that the retrieved documents and the function schemas count toward that same limit, on top of the memory buffer and the completion budget reserved by max_tokens.

Here are some strategies you might consider to prevent reaching this limitation:

  • Use a model with a larger context window. Some models, such as gpt-3.5-turbo-16k-0613, can handle up to 16,384 tokens of context.

  • Periodically clear the conversation history stored in the AgentExecutor, for example by creating a new AgentExecutor instance every N turns.

  • Use a Memory module like AgentTokenBufferMemory to keep long-term memories separate from the conversation context. This helps keep the core conversation context concise.

  • Lower the max_token_limit you pass to AgentTokenBufferMemory so that older turns are trimmed earlier. (AgentExecutor itself has no max_context_length parameter; the trimming is handled by the memory module.) With a 4097-token model and max_tokens=512 reserved for the completion, a buffer of 2500 tokens leaves little room for the retrieved documents and system prompt.

  • Use a Retrieval module like VectorStoreRetriever to fetch pertinent information dynamically when needed, rather than keeping it in the conversation history.
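As a rough illustration of the trimming idea behind the memory-related points above (this is not LangChain's actual implementation, and the word-count token counter is a crude stand-in for a real tokenizer such as tiktoken), a buffer that evicts the oldest turns once a token budget is exceeded might look like:

```python
# Hypothetical sketch of a token-budget conversation buffer.
# Tokens are approximated by whitespace word count for simplicity.

class TokenBufferMemory:
    def __init__(self, max_token_limit: int):
        self.max_token_limit = max_token_limit
        self.messages: list[str] = []

    @staticmethod
    def count_tokens(text: str) -> int:
        # Crude stand-in for a real tokenizer.
        return len(text.split())

    def total_tokens(self) -> int:
        return sum(self.count_tokens(m) for m in self.messages)

    def add_message(self, message: str) -> None:
        self.messages.append(message)
        # Drop the oldest messages until the buffer fits the budget.
        while self.messages and self.total_tokens() > self.max_token_limit:
            self.messages.pop(0)
```

For example, with a budget of 6 "tokens", adding three 3-word messages evicts the first one, so only the two most recent turns survive. Lowering max_token_limit in your real setup has the same effect: older turns fall out of the prompt sooner.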

The fundamental concept is to store information outside the central conversation context whenever feasible and manage the context’s length to remain within your model’s limitations.
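To make that last point concrete, the retrieval pattern keeps documents in an external store and injects only the snippets relevant to the current question into the prompt. A toy keyword-overlap retriever (a hypothetical stand-in for a real embeddings-based vector store) could look like:

```python
# Toy retriever: scores documents by keyword overlap with the query.
# A real setup would use embeddings and a vector store instead.

def retrieve(query: str, documents: list[str], k: int = 1) -> list[str]:
    query_words = set(query.lower().split())

    def score(doc: str) -> int:
        # Number of query words that also appear in the document.
        return len(query_words & set(doc.lower().split()))

    # Return the k best-matching documents, highest overlap first.
    return sorted(documents, key=score, reverse=True)[:k]
```

Only the retrieved snippet enters the prompt for that one turn, so the document collection can be arbitrarily large without the conversation context ever growing.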


Ohh, thank you, I will try these options 🙂
