I just upgraded to a paid OpenAI account to use it under better conditions…
HOWEVER (as always), after a little bit of thinking from Auto-GPT, this error message appeared and automatically stopped the whole process:
openai.error.InvalidRequestError: This model's maximum context length is 4097 tokens. However, your messages resulted in 5101 tokens. Please reduce the length of the messages.
Of course, this is just the end of the message; it all started with something weird:
"Traceback (most recent call last):
  File "", line 198, in _run_module_as_main
  File "", line 88, in _run_code"
Do you have any idea how to fix this issue? I'm a newbie here.
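In case it helps anyone answer: from what I understand, the fix would be to shorten the conversation before it's sent to the API. Here's a minimal sketch of the idea, trimming the oldest non-system messages until the estimated token count fits under the 4097-token limit from the error. Everything here is my assumption, not Auto-GPT's actual code, and the ~4-characters-per-token estimate is just a rough heuristic (a real token counter like tiktoken would be more accurate):

```python
# Sketch (my assumptions): estimate tokens as len(text) / 4 and drop the
# oldest non-system messages until the conversation fits the model limit.

MAX_CONTEXT_TOKENS = 4097  # limit reported in the error message

def estimate_tokens(text):
    # Crude heuristic: roughly 4 characters per token for English text.
    return len(text) // 4 + 1

def trim_messages(messages, limit=MAX_CONTEXT_TOKENS):
    """Drop the oldest non-system messages until the total fits the limit."""
    trimmed = list(messages)
    while sum(estimate_tokens(m["content"]) for m in trimmed) > limit:
        # Find the oldest message that is not a system prompt and drop it.
        for i, m in enumerate(trimmed):
            if m["role"] != "system":
                del trimmed[i]
                break
        else:
            break  # only system messages left; nothing more to drop
    return trimmed

messages = [
    {"role": "system", "content": "You are Auto-GPT."},
    {"role": "user", "content": "x" * 30000},   # oversized old message
    {"role": "user", "content": "latest question"},
]
print(len(trim_messages(messages)))  # → 2 (the oversized message is dropped)
```

The system prompt is kept on purpose, since dropping it would change the agent's behavior; only older conversation history gets discarded.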