Temperature in GPT-5 models

Seeing as how:

```
"message": "Unsupported parameter: 'temperature' is not supported with this model."
```
is now a thing with GPT-5 models (as they're reasoning models), how does that work when a request might get handed off to a "non-thinking" model (as happens in ChatGPT)? Or is this just not a thing anymore unless we're specifically reaching out to 4.1?
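For anyone hitting this in production code, one defensive option is to catch the rejection and retry without the offending parameter. This is just a sketch: `strip_unsupported_param` is a hypothetical helper that parses the error string shown above, not anything from the SDK.

```python
import re

def strip_unsupported_param(params: dict, error_message: str) -> dict:
    """Return a copy of params with the parameter named in an
    'Unsupported parameter' error removed, so the request can be
    retried. (Hypothetical helper, not part of the openai SDK.)
    """
    match = re.search(r"Unsupported parameter: '(\w+)'", error_message)
    cleaned = dict(params)
    if match:
        cleaned.pop(match.group(1), None)
    return cleaned

params = {"model": "gpt-5", "messages": [], "temperature": 0.2}
error = "Unsupported parameter: 'temperature' is not supported with this model."
retry_params = strip_unsupported_param(params, error)
# retry_params no longer contains 'temperature'
```

The fragile part is that it keys off the wording of the error message, so treat it as a stopgap rather than a contract.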


Yes, just ran into this myself. My guess is “this is just not a thing anymore,” but perhaps we’ll see.

Is it because of maintaining stable reasoning? Will they add temperature back as a parameter in the future?

> Will they add temperature back as a parameter in the future?

Don’t count on it. The temperature and top_p params are not needed for GPT-5 models.


This is a breaking change and absolutely unexpected. I really hope they add it back, as a significant number of real-world applications rely on more deterministic responses.

I am facing the same issue! It works with the default value of 1.0, but there is no option to vary it.
Does anyone else experience the same?

You’ll see the same behavior with other API reasoning models when you attempt to set top_p or temperature. You can send the default value of 1 successfully; any divergence from that will fail validation.

The playground is what exposes an error that an API programmer with this understanding would rarely make: you can easily switch the model there and be shown a call that will fail. I expect this will be tweaked as things settle in.

One has to assume some internal, task-based self-adjustment is replacing your desired control.

max_tokens is not remapped either, because max_completion_tokens reflects a total budget (including reasoning tokens), not just what you receive.
