hey all,
I’ve been playing around a little bit with ChatGPT and I don’t seem to be hitting the same token limits. Has OpenAI established a way for the bot to have longer term memory than the token limit?
Hi, this may be related to my own question: how do you force the model to continue a truncated completion?
There are some techniques that allow for longer-term memory than the token limit.
Here are some approaches I've seen implemented by many people, e.g. on @daveshapautomator's YouTube channel:
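One common pattern is a "rolling summary": keep recent turns verbatim, and once the history grows too long, fold it into a compressed summary so the prompt always fits under the limit. Here's a minimal sketch of that idea; `call_llm` is a hypothetical placeholder you'd wire up to whatever completion API you actually use, and the character budget is a rough stand-in for a real token count.

```python
# Rolling-summary memory: one common workaround for the context/token limit.
# `call_llm` is a placeholder stub, not a real API -- replace it with your own call.

def call_llm(prompt: str) -> str:
    # Placeholder: swap in a real completion call here.
    return f"[model reply to: {prompt[:40]}...]"

class RollingSummaryMemory:
    def __init__(self, max_chars: int = 4000):
        self.summary = ""           # compressed history so far
        self.recent = []            # verbatim recent turns
        self.max_chars = max_chars  # rough proxy for a token budget

    def add_turn(self, role: str, text: str) -> None:
        self.recent.append(f"{role}: {text}")
        # When the verbatim history gets too long, fold it into the summary.
        if sum(len(t) for t in self.recent) > self.max_chars:
            self.summary = call_llm(
                "Summarize this conversation, keeping key facts:\n"
                + self.summary + "\n" + "\n".join(self.recent)
            )
            self.recent = []

    def build_prompt(self, user_message: str) -> str:
        # The prompt stays bounded: summary + only the most recent turns.
        return (
            f"Conversation summary so far:\n{self.summary}\n\n"
            + "\n".join(self.recent)
            + f"\nuser: {user_message}\nassistant:"
        )

memory = RollingSummaryMemory()
memory.add_turn("user", "Hi, my name is Ada and I work on compilers.")
reply = call_llm(memory.build_prompt("What do I work on?"))
memory.add_turn("assistant", reply)
```

Other variants store old turns as embeddings and retrieve only the most relevant ones at prompt time, but the idea is the same: never send the full raw history.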
These are great! I’ll definitely take a look!
(Funnily enough, I'm seeing this right after replying to your other post!)