Temperature docs for 1 max vs 2 max
Defaults to 1

What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic.

We generally recommend altering this or top_p but not both.

Sounds like the higher value of 0.8 was from the 1 max scale? I’ve found anything over 1.0 gets a bit crazy quickly…

The temperature is the same as it ever was, although different models may have different amounts of apparent “creativity” with the word possibilities they generate at a particular setting.

And yes, a good reason for some documentation to recommend a maximum temperature of 1.0 (which is actually the logits without alteration) is the nonsense output and the model's reduced ability to censor itself at higher settings.
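To illustrate what "logits without alteration" means: temperature just divides the logits before the softmax, so 1.0 is a no-op, values below 1.0 sharpen the distribution (more deterministic), and values above 1.0 flatten it (more random). A minimal sketch with made-up logits, not the actual model internals:

```python
import math

def token_probabilities(logits, temperature=1.0):
    """Softmax over temperature-scaled logits.

    temperature == 1.0 leaves the logits unaltered (plain softmax);
    lower values sharpen the distribution, higher values flatten it.
    """
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits for three candidate tokens
logits = [2.0, 1.0, 0.1]

focused = token_probabilities(logits, 0.2)  # near-deterministic
neutral = token_probabilities(logits, 1.0)  # unaltered logits
wild    = token_probabilities(logits, 2.0)  # flatter, more random
```

At 0.2 the top token takes nearly all the probability mass; at 2.0 the tail tokens get a real chance of being sampled, which is where the "crazy" output comes from.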

There’s no technical reason why the AI model couldn’t run at higher than 2 (the API’s restriction), but there’s no practical reason to either.


OpenAI originally allowed us to use 0 to 1… now it’s 0 to 2… But yeah, just pointing out the 0.8 in the example should prob be 1.0…

Not a biggie, but I remember answering questions about it more than once.