I am trying to use the new prompt caching controls with the recently released gpt-5.1 model, specifically the prompt_cache_retention parameter, which OpenAI documents as a way to extend prompt cache retention for up to 24 hours.
However, when I pass prompt_cache_retention="24h" to the Responses API, I get this error: TypeError: Responses.create() got an unexpected keyword argument 'prompt_cache_retention'
The call works correctly if I remove this argument.
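For reference, this is the raw Responses API request I understand the SDK call should map to, sketched with the standard library so it can be inspected without sending anything. The endpoint and field names reflect my reading of the docs; the API key and prompt text are placeholders, and the assumption that the keyword argument becomes a top-level JSON field is mine:

```python
import json
import urllib.request

# Request body I believe the SDK call corresponds to; prompt_cache_retention
# is assumed to be a top-level field per OpenAI's prompt caching docs.
payload = {
    "model": "gpt-5.1",
    "input": "Hello",  # placeholder prompt
    "prompt_cache_retention": "24h",
}

# Build (but do not send) the HTTP request, just to show the wire format.
req = urllib.request.Request(
    "https://api.openai.com/v1/responses",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": "Bearer sk-...",  # placeholder key
        "Content-Type": "application/json",
    },
    method="POST",
)
print(req.get_full_url())
```

If that body is what the API actually expects, then the TypeError presumably means my installed openai package simply predates the parameter, but I have not been able to confirm that.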
Any guidance or working example of prompt_cache_retention with gpt-5.1 in the Responses API would be appreciated.