Hi,
I am creating a Telegram chatbot, and after the conversation runs for a while I get an error about exceeding the maximum context length. Is there a standard way to reset or trim the context? Thanks!
Error: openai.error.InvalidRequestError: This model’s maximum context length is 2049 tokens, however you requested 2060 tokens (1910 in your prompt; 150 for the completion). Please reduce your prompt; or completion length.
Hey, were you able to resolve this problem? I am facing the same issue.
I would say do what the error message says: reduce your input prompt by truncating the oldest messages in the conversation history.
Be proactive: estimate the size of your input prompt and keep it under a fixed budget. A rough rule of thumb is W = T/sqrt(2), where W is the number of English words and T is the number of tokens.
In your case, budget T = 1800 tokens (or 1700 for more margin). At 1700 tokens that works out to roughly 1200 words, so whenever the conversation history exceeds 1200 words, drop the oldest messages until it fits under that limit.
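A minimal sketch of that truncation loop, assuming the history is kept as a plain list of message strings (oldest first) and using the W = T/sqrt(2) word estimate above; the names `trim_history` and `MAX_WORDS` are just illustrative:

```python
import math

MAX_TOKENS = 1700  # prompt budget, leaving margin under the 2049-token limit
MAX_WORDS = int(MAX_TOKENS / math.sqrt(2))  # ~1200 words via W = T/sqrt(2)

def word_count(text):
    """Rough word count used as a proxy for token count."""
    return len(text.split())

def trim_history(history, max_words=MAX_WORDS):
    """Drop the oldest messages until the total word count fits the budget."""
    trimmed = list(history)
    while trimmed and sum(word_count(m) for m in trimmed) > max_words:
        trimmed.pop(0)  # discard the oldest message first
    return trimmed
```

Call `trim_history(history)` right before you build the prompt for each API request, so the oldest turns fall away as the conversation grows instead of the request erroring out.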