ChatGPT API Memory Retention Capabilities

When I first call the API with a long question that includes context and then ask a second question, it retains that context. What are the limits of this? Does it still remember the context after I ask enough questions to exceed the 4k-token limit? And what if my first question includes 3k tokens of context and a follow-up question adds another 3k tokens of context — can it handle both?


Can we avoid this memory somehow? If we use the API to serve multiple end users, I don't want one user's conversation somehow bleeding into another user's responses.

You'd need to check the code of the client sending the requests. The API itself is stateless: some client programs re-send the previous conversation as context with every request, while others send each request independently. And once the accumulated history exceeds the token limit, the model definitely can't "remember" it anyway.
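To illustrate the point above, here is a minimal sketch of how a client keeps "memory" per user by re-sending history with each request, so one user's turns never enter another user's request. The `ChatSession` class and `max_history` cap are illustrative, not part of any real SDK; the list it builds is what you would pass as the `messages` parameter of a chat-completion call.

```python
class ChatSession:
    """Client-side conversation state for ONE end user.

    The API is stateless: "memory" exists only because the client
    re-sends the system prompt plus recent turns on every request.
    """

    def __init__(self, system_prompt: str, max_history: int = 20):
        self.system_prompt = system_prompt
        self.max_history = max_history  # cap retained turns to stay under the token limit
        self.history: list[dict] = []

    def build_messages(self, user_text: str) -> list[dict]:
        # Every request carries the full context; nothing persists server-side.
        recent = self.history[-self.max_history:]
        return [
            {"role": "system", "content": self.system_prompt},
            *recent,
            {"role": "user", "content": user_text},
        ]

    def record(self, user_text: str, assistant_text: str) -> None:
        # Store the completed turn so the NEXT request can include it.
        self.history.append({"role": "user", "content": user_text})
        self.history.append({"role": "assistant", "content": assistant_text})


# One session per end user: user A's history never appears in user B's requests.
sessions: dict[str, ChatSession] = {}

def messages_for(user_id: str, text: str) -> list[dict]:
    session = sessions.setdefault(user_id, ChatSession("You are a helpful assistant."))
    return session.build_messages(text)
```

To get the "no memory" behavior, simply never call `record` (or never reuse a session): each request then contains only the system prompt and the new question.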

That said, many people actually want the API to remember context.