New 24h Prompt Caching Retention - Only Certain Models?

The new `prompt_cache_retention` key shipped today (along with 5.1) - nowhere in the docs does it say that some models don't support it, yet:

```json
{
  "error": {
    "message": "prompt_cache_retention is not supported on this model",
    "type": "invalid_request_error",
    "param": "prompt_cache_retention",
    "code": "invalid_parameter"
  }
}
```

Are there some models where it doesn't work?


It seems only a few models are supported.

> Extended prompt cache retention is available for gpt-5.1 models, including gpt-5.1-codex, gpt-5.1-codex-mini, and gpt-5.1-chat-latest.

source
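Given that restriction, one way to avoid the `invalid_parameter` error is to attach the key conditionally. A minimal sketch (the supported-model list here is taken from the docs quote above; the payload shape assumes the Responses-style `model`/`input` fields and the `"24h"` retention value):

```python
import json

# Models the docs list as supporting extended prompt cache retention
# (assumption: this list may grow over time).
CACHE_RETENTION_MODELS = {
    "gpt-5.1",
    "gpt-5.1-codex",
    "gpt-5.1-codex-mini",
    "gpt-5.1-chat-latest",
}

def build_payload(model: str, prompt: str) -> dict:
    """Build a request payload, adding prompt_cache_retention only
    for models that support it, so other models don't 400 with
    invalid_request_error."""
    payload = {"model": model, "input": prompt}
    if model in CACHE_RETENTION_MODELS:
        payload["prompt_cache_retention"] = "24h"
    return payload

print(json.dumps(build_payload("gpt-5.1", "hello"), indent=2))
```

Gating on a known-good list client-side is simpler than catching the `invalid_request_error` after the fact, at the cost of maintaining the list as model support changes.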


LOL - was that there 2 hours ago?! Thank you…
