Does prompt caching persist between different models?

I have a long document that I am using as context for a series of prompts.

I would like to use 4o-mini early in the sequence, followed by 4o later in the sequence for more challenging tasks. Will the prompt cache persist between models, or only within queries to the same model?

Sadly, no — prompt caching is currently scoped to a single model. Cached prefixes are keyed to the specific model that processed them and cannot be shared across models, so when you switch from 4o-mini to 4o, the shared document is treated as a brand-new prompt and cached separately for that model.
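Conceptually, you can think of the cache as being keyed by both the model and the prompt prefix. This is only an illustrative toy sketch (the real cache lives server-side and is managed automatically; the `PromptCache` class and its methods are invented for illustration), but it shows why the same document is a hit for one model and a miss for another:

```python
import hashlib

# Toy illustration of a prompt cache keyed by (model, prompt prefix).
# This is NOT the provider's actual implementation -- just a sketch of
# why a cached prefix for one model does not help a different model.
class PromptCache:
    def __init__(self):
        self._store = {}

    def _key(self, model: str, prefix: str) -> tuple:
        # The model name is part of the key, so an identical prefix
        # sent to a different model produces a different key (a miss).
        return (model, hashlib.sha256(prefix.encode()).hexdigest())

    def lookup(self, model: str, prefix: str) -> bool:
        return self._key(model, prefix) in self._store

    def store(self, model: str, prefix: str) -> None:
        self._store[self._key(model, prefix)] = True

cache = PromptCache()
doc = "<long shared document used as context>"

cache.store("gpt-4o-mini", doc)
print(cache.lookup("gpt-4o-mini", doc))  # True: same model, cache hit
print(cache.lookup("gpt-4o", doc))       # False: different model, cache miss
```

In practice you can confirm this by checking `usage.prompt_tokens_details.cached_tokens` in the API response: it will only climb once you have sent the same prefix to the same model more than once.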