GPT-3. How to reset context length after error?

Hi,
I am creating a Telegram chatbot and, after the conversation goes on for a while, I receive an error about exceeding the maximum context length. Any idea how to reset the context length? Is there a standard way to do this? Thanks!

Error: openai.error.InvalidRequestError: This model’s maximum context length is 2049 tokens, however you requested 2060 tokens (1910 in your prompt; 150 for the completion). Please reduce your prompt; or completion length.


Hey, were you able to resolve this problem? I am facing the same issue.

I would say do what the error message says: reduce your input prompt by truncating the older messages.

Be proactive about estimating the size of your input prompt and keep it under a set budget. A rough rule of thumb is W = T/sqrt(2), where W is the number of English words and T is the number of tokens.

In your case, T = 1800 (or 1700 for more margin, since the 2049-token window must also hold the 150-token completion). At T = 1700, that works out to about 1200 words. If the history counts more than 1200 words, drop the oldest messages until it fits under 1200 words.
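A minimal sketch of that approach, assuming the conversation is kept as a plain list of message strings (the function and variable names here are illustrative, not part of the OpenAI library):

```python
import math

# Budget from the estimate above: T = 1700 tokens -> W = T / sqrt(2) ~ 1200 words.
WORD_BUDGET = int(1700 / math.sqrt(2))  # ~1202 words

def truncate_history(history):
    """Drop the oldest messages until the estimated word count fits the budget.

    `history` is a list of message strings, oldest first; it is modified
    in place and also returned for convenience.
    """
    def total_words(messages):
        return sum(len(m.split()) for m in messages)

    while history and total_words(history) > WORD_BUDGET:
        history.pop(0)  # discard the oldest message first
    return history
```

Calling this before each request keeps the prompt near the 1700-token budget, leaving headroom for the 150-token completion within the 2049-token window.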

I have attempted the suggested fix of reducing the prompt, but the same error message persists. The main question now is how to clear the chat history so that the token queue is emptied. We have made multiple attempts without success. Acquiring additional tokens is a possibility, but the real challenge is clearing the chat history and restoring an empty token queue.
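For what it's worth, the completions API is stateless: there is no server-side token queue to clear. The model only ever sees the history your bot includes in the prompt, so "clearing" it just means emptying the list your own code maintains. A minimal sketch, assuming the bot keeps per-chat histories in a dict (the handler names here are illustrative, not part of any library):

```python
# All conversation state lives in the bot; the OpenAI API stores nothing
# between requests, so emptying this dict entry is a full reset.
histories = {}  # chat_id -> list of message strings, oldest first

def remember(chat_id, text):
    """Append an incoming message to this chat's history."""
    histories.setdefault(chat_id, []).append(text)

def build_prompt(chat_id):
    """Join the stored history into the prompt text sent to the API."""
    return "\n".join(histories.get(chat_id, []))

def reset_chat(chat_id):
    """E.g. wired to a /reset command: forget the conversation entirely."""
    histories[chat_id] = []
```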