How to enable temporary mode (memoryless prompting) in the API

Hello,

Is there a way to enable the “temporary chat mode” when querying a model (specifically o1-mini)?

As I am working on a research project, I need to avoid data leakage; otherwise, the “test” dataset would already have been seen by the model.

Thanks a lot


There’s no specific “temporary mode” toggle. By default, the OpenAI API doesn’t store conversation history between calls. To go memory-less, don’t include previous messages in the request. Each request then becomes a standalone interaction with no past context.
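A minimal sketch of that idea (the helper function and prompts here are illustrative, not part of the OpenAI SDK): build each request payload from scratch so it contains only the current message, and no earlier conversation can leak in.

```python
def build_stateless_request(model: str, user_prompt: str) -> dict:
    # The payload contains only the current prompt -- no prior turns --
    # so each API call is an independent, memoryless interaction.
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_prompt}],
    }

# Two separate evaluation queries; neither carries context from the other.
req1 = build_stateless_request("o1-mini", "Classify example A: ...")
req2 = build_stateless_request("o1-mini", "Classify example B: ...")

# Each payload would then be sent as its own call, e.g. with the
# openai Python SDK: client.chat.completions.create(**req1)
```

Appending earlier responses to `messages` is what creates "memory"; as long as you never do that, every call is effectively a temporary chat.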


In addition to the Chat Completions endpoint being stateless (the model only knows what you send it and doesn’t learn from it), API data isn’t used for training unless you explicitly opt in to data sharing in your platform account settings. So by default, any held-out set you want to keep private for future evaluation also won’t be used to improve models.