I noticed a change recently with the temperature parameter when using GPT-4. The API documentation states that temperature can be adjusted between 0 and 2, with a default of 1. However, in the GPT-4 playground, temperature could only be adjusted between 0 and 1, with a default of 0.7, until recently, when the slider changed to allow values from 0 to 2.
Did OpenAI recently change the range and default for temperature when using the API directly?
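For anyone who wants to check this outside the Playground, here's a minimal sketch of passing temperature through the API. It assumes the openai Python SDK (v1.x style) and that OPENAI_API_KEY is set in your environment; the prompt and value of 1.5 are just for illustration.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# The API accepts temperature anywhere from 0 to 2 (default 1);
# values above 1 make sampling noticeably more random.
response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Write two sentences about a lighthouse."}],
    temperature=1.5,
)
print(response.choices[0].message.content)
```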
The 0 to 2 range has been around in the API for a while, at least as long as I've been around, which goes back to Davinci. The output just gets … very … creative/insane at temperatures higher than 1.
1.2 was a nice sweet spot for some very visual storytelling before it went incoherent.
I've seen some people claim they can actually understand the word vomit. Others say you can feed the madness back to GPT and it can decipher it.
To answer your question: the 0 to 2 range has been available in the API for a while, just not in the Playground (until now).
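For reference, here's a rough sketch of what that 1.2 sweet spot looked like on the completions endpoint. It assumes the same v1.x Python SDK as above and the text-davinci-003 model mentioned earlier; treat it as illustrative rather than a recipe.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Same idea on the completions endpoint: temperature=1.2 tends to stay
# (mostly) coherent while getting much more vivid than the default.
response = client.completions.create(
    model="text-davinci-003",   # the Davinci model referenced above
    prompt="Describe an abandoned carnival at dusk.",
    temperature=1.2,
    max_tokens=200,
)
print(response.choices[0].text)
```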
It definitely has not been around for a while (on the Playground at least). I use Davinci and gpt-3.5-turbo constantly, every day, and it’s always defaulted to 0.7, with a max temperature of 1. Now it’s defaulting to 1, which makes me wonder if they changed how the scaling works.