Is it possible for the OpenAI API to cache the conversation?

As for managing this on our side, we are already doing that. We really shouldn't have to resend the whole context on every request and 'lose' tokens on it; could the API keep the conversation server-side, referenced by some request ID or context ID?
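For reference, this is roughly the pattern we use today (a minimal sketch; the model name and the `ask` helper are illustrative, not part of the real API):

```python
# The Chat Completions API is stateless: there is no server-side
# context ID, so every request must carry the full conversation.
history = [{"role": "system", "content": "You are a helpful assistant."}]

def ask(client, user_message, model="gpt-4o"):
    """Append the user turn, send the whole history, store the reply."""
    history.append({"role": "user", "content": user_message})
    response = client.chat.completions.create(model=model, messages=history)
    reply = response.choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    return reply
```

So `history` grows with every turn, and the entire list is billed as input tokens on each call, which is exactly the cost we are hoping a context ID could avoid.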