```
"message": "Unsupported parameter: 'temperature' is not supported with this model."
```
is now a thing with GPT-5 models (as they’re reasoning models). How does that work when a request might get handed off to a “non-thinking” model (as is done in ChatGPT)? Or is this just not a thing anymore unless we’re specifically reaching out to 4.1?
This is a breaking change and absolutely unexpected. I really hope they add it back, as a significant number of real-world applications rely on more deterministic responses.
You experience the same behavior with other API reasoning models when you attempt to set top_p or temperature. You can send the default value of 1 successfully; any divergence from that will fail validation.
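A minimal sketch of that behavior with the official openai Python SDK (the model name here is an assumption, and BadRequestError is the SDK’s generic 400 error, not something specific to this case): omitting temperature, or sending the default of 1, goes through, while any other value is rejected at validation before generation starts.

```
from openai import OpenAI, BadRequestError

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

messages = [{"role": "user", "content": "Say hello."}]

# Omitting temperature (i.e. the default of 1) is accepted by reasoning models.
ok = client.chat.completions.create(model="gpt-5", messages=messages)
print(ok.choices[0].message.content)

# Any non-default temperature (or top_p) fails request validation.
try:
    client.chat.completions.create(
        model="gpt-5",
        messages=messages,
        temperature=0.2,  # anything other than 1 is rejected
    )
except BadRequestError as e:
    print(e)  # "Unsupported parameter: 'temperature' is not supported with this model."
```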
The Playground is what exposes an error that an API programmer with this understanding would rarely make: you can easily switch models there and be shown a call that will fail. I expect this will be tidied up as things settle in.
One has to assume some internal, task-based self-adjustment is replacing the control you would otherwise have.
max_tokens is not remapped either, because max_completion_tokens reflects a total budget (including internal reasoning tokens), not just the visible output you receive.
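To illustrate that budget semantics (again just a sketch, with an assumed model name): a tight max_completion_tokens cap can be spent largely on hidden reasoning tokens, leaving little or no visible text, which the usage fields let you confirm.

```
from openai import OpenAI

client = OpenAI()

# max_completion_tokens caps the whole budget: hidden reasoning tokens plus
# the visible answer. A tight cap may be consumed mostly by reasoning.
resp = client.chat.completions.create(
    model="gpt-5",  # assumed reasoning-model name
    messages=[{"role": "user", "content": "Summarize the theory of relativity."}],
    max_completion_tokens=64,
)

print(resp.choices[0].message.content)  # may be short or even empty
print(resp.usage.completion_tokens)     # counts reasoning + visible tokens
print(resp.usage.completion_tokens_details.reasoning_tokens)  # hidden share of the budget
```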