API Temperature Range Changed from 0–1 to 0–2 in the Playground

I noticed a recent change to the temperature parameter when using GPT-4. The API documentation states that temperature can be adjusted between 0 and 2, with a default of 1. However, in the GPT-4 Playground, temperature could only be adjusted between 0 and 1, with a default of 0.7, until recently, when the slider began allowing values from 0 to 2.

Did OpenAI recently change the range and default for temperature when using the API directly?

I’m curious if:

  1. The temperature scale has always been different between the playground and the API, in which case I overlooked that detail.

  2. OpenAI updated the model itself, necessitating a change in how temperature is represented.

  3. Only the Playground interface was updated to simplify the temperature range for users, while the API retains the original 0-to-2 scale.

  4. Something else is going on that I’m missing!

Want to make sure I have the correct understanding so I can apply temperature properly.
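Whatever the Playground slider shows, the API itself accepts the documented 0–2 range. As a minimal sketch, here is how a Chat Completions request body with an explicit temperature might be built and validated locally before sending; the helper name and the local range check are my own, the 0–2 bounds come from the documentation quoted above:

```python
import json


def build_chat_payload(prompt: str, temperature: float = 1.0) -> dict:
    """Build a Chat Completions request body with an explicit temperature.

    The API docs give temperature a range of 0-2 with a default of 1;
    out-of-range values are rejected here before any request is made.
    """
    if not 0.0 <= temperature <= 2.0:
        raise ValueError("temperature must be between 0 and 2")
    return {
        "model": "gpt-4",
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }


# A value above the old Playground cap of 1 is valid for the API.
payload = build_chat_payload("Tell me a story.", temperature=1.2)
print(json.dumps(payload))
```

Sending this payload to the `/v1/chat/completions` endpoint with a temperature of, say, 1.2 works regardless of what the Playground UI allowed at the time.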



The API documentation section about this; I'm not sure whether it was recently changed.


The 0–2 range has been around for a while, at least since I started using the API back in the Davinci days. The model just gets … very … creative/insane at temperatures higher than 1.

1.2 was a nice sweet spot for some very visual storytelling until it went incoherent.

I’ve seen some people claim that they can actually understand the word vomit. Others say that you can feed the madness back to GPT and it can decipher it.

To answer your question: the 0–2 range has been in the API for a while, just not in the Playground (until now).


It definitely has not been around for a while (in the Playground, at least). I use Davinci and gpt-3.5-turbo constantly, every day, and the Playground has always defaulted to 0.7, with a max temperature of 1. Now it defaults to 1, which makes me wonder if they changed how the scaling works.


Yeah, I’m sure the scaling changed. A value of 1 on the new 0–2 scale seems to match the old 0.7 on the 0–1 scale, and 2 on the new scale seems to match the old 1.x.
I’m also experimenting with 2/2.
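That guessed correspondence (new 1 ≈ old 0.7, new 2 ≈ old 1.x) can be written as a piecewise-linear mapping. This is purely speculative, nothing confirmed by the team, and the function is just a way to make the guess concrete:

```python
def old_to_new_temperature(old: float) -> float:
    """Speculative piecewise-linear map from the old 0-1 Playground scale
    to the new 0-2 scale, following the guess that new 1 ~ old 0.7 and
    new 2 ~ old 1.x. Unconfirmed; treat as a rough rule of thumb only.
    """
    if not 0.0 <= old <= 1.0:
        raise ValueError("old-scale temperature must be in [0, 1]")
    if old <= 0.7:
        return old / 0.7                # maps old 0.7 -> new 1.0
    return 1.0 + (old - 0.7) / 0.3      # maps old 1.0 -> new 2.0


print(old_to_new_temperature(0.7))  # 1.0
```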

I tried compressing text at 2/2 and deciphering it at 1.x/2, but nothing has worked so far.

@RonaldGRuckus do you have any sources to back that up? Or do you know people who have succeeded in doing that?

That’s what I said

No sources, just what people would say in the Discord. Personally, I don’t think it’s possible.


Anything definitive on scaling from the team yet?
