Prompt_cache_retention not being recognized as a valid argument

I am trying to use the new prompt caching controls with the recently released gpt-5.1 model, specifically the prompt_cache_retention parameter that OpenAI documented as a way to extend cache retention to up to 24 hours.

However, when I pass prompt_cache_retention="24h" to the Responses API, I get this error: TypeError: Responses.create() got an unexpected keyword argument 'prompt_cache_retention'

The call works correctly if I remove this argument.

Any guidance or working example of prompt_cache_retention with gpt-5.1 in the Responses API would be appreciated.

You are not getting an API error.

You are getting a client-side error: the installed OpenAI Python SDK rejects a keyword argument it does not recognize before any request is sent.

Solution: update the openai library in your environment.
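As a quick way to confirm the environment is the problem, you can print the installed SDK version (standard library only; the exact minimum openai release that ships prompt_cache_retention is not pinned here, so this is just a diagnostic sketch):

```python
# Check which openai SDK release is installed, since
# prompt_cache_retention is only recognized by recent versions.
from importlib.metadata import PackageNotFoundError, version

try:
    print("openai SDK version:", version("openai"))
except PackageNotFoundError:
    print("openai SDK is not installed in this environment")
```

If the version predates the parameter, a plain pip install --upgrade openai brings it current.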

Better: code the API calls yourself without "import bloat", and send whatever parameters you want without needing a new openai module installation every week.
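A minimal sketch of that approach, using only the standard library. The endpoint and prompt_cache_retention are as documented for the Responses API; the input text here is illustrative, and the request helper assumes OPENAI_API_KEY is set in the environment:

```python
import json
import os
import urllib.request

# Build the request body by hand; nothing is validated client-side,
# so newly documented parameters can be sent immediately.
payload = {
    "model": "gpt-5.1",
    "input": "Say hello.",
    "prompt_cache_retention": "24h",  # extended prompt cache retention
}

def send(payload: dict) -> dict:
    """POST the payload to the Responses API and return the parsed JSON."""
    req = urllib.request.Request(
        "https://api.openai.com/v1/responses",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

print(json.dumps(payload, indent=2))
# To actually issue the request (requires OPENAI_API_KEY):
#   print(send(payload))
```

Since no parameter list is baked into your code, unknown fields are accepted or rejected by the API itself, so new features work the day they ship.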


That was embarrassing… will switch immediately, thank you.
