Realtime API error when temperature less than 0.6

When temperature is below 0.6, the WebSocket connection is closed by the remote server. I tried 0.6, 0.7, 0.8, 0.9, and 1.0, and they all work well.
Theoretically, a model's temperature setting should not affect its ability to operate stably, correct? I am quite curious which features of the Realtime API require a temperature of at least 0.6.

The “closed” you mention is an API error, which you should be catching in your code. Temperature below 0.6 simply isn’t allowed on the Realtime API.
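As a minimal sketch of catching this before the server does: validating the temperature client-side avoids sending a `session.update` the server will reject. The 0.6 floor comes from the error described above; the 1.2 ceiling and the event shape are assumptions based on the public Realtime API documentation, so adjust to what your endpoint actually enforces.

```python
import json

# Assumed bounds from the Realtime API docs; the 0.6 floor matches the
# error discussed in this thread.
REALTIME_MIN_TEMPERATURE = 0.6
REALTIME_MAX_TEMPERATURE = 1.2

def build_session_update(temperature: float) -> str:
    """Build a session.update event, raising early on an invalid temperature
    instead of letting the server close the WebSocket."""
    if not (REALTIME_MIN_TEMPERATURE <= temperature <= REALTIME_MAX_TEMPERATURE):
        raise ValueError(
            f"temperature must be in [{REALTIME_MIN_TEMPERATURE}, "
            f"{REALTIME_MAX_TEMPERATURE}], got {temperature}"
        )
    return json.dumps({
        "type": "session.update",
        "session": {"temperature": temperature},
    })
```

With this check in place, `build_session_update(0.5)` raises a `ValueError` locally rather than surfacing later as a closed connection.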


Something about audio generation is predictive and patterned: at very low temperature the AI can self-reinforce into loops of nonsense that cannot be decoded, so some temperature is needed to let the sampler randomize away from its internal generation.

If you want to see where low temperature takes speech, from uninteresting output exactly in the style of the “voice” training all the way to broken audio, you can use Chat Completions, which accepts lower temperatures.
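A hedged sketch of that comparison: the request below asks an audio-capable chat model for a spoken reply at temperature 0, which the Chat Completions endpoint permits. The model name, voice, and payload shape follow the public documentation for audio output, but treat them as assumptions and substitute whatever your account exposes.

```python
import json

def build_audio_request(text: str, temperature: float = 0.0) -> dict:
    """Assemble a Chat Completions request body asking for an audio reply.
    Unlike the Realtime API, this endpoint accepts temperature down to 0."""
    return {
        "model": "gpt-4o-audio-preview",          # assumed audio-capable model
        "modalities": ["text", "audio"],          # request both text and speech
        "audio": {"voice": "alloy", "format": "wav"},
        "temperature": temperature,               # 0 is allowed here
        "messages": [{"role": "user", "content": text}],
    }

payload = build_audio_request("Does temperature affect model stability?")
print(json.dumps(payload, indent=2))
```

POST this body to the Chat Completions endpoint with your usual client and compare the returned audio at temperature 0 versus 0.6 and above.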

Here, just running your question as input at temperature: 0, only the first paragraph is heard, and then the audio breaks into silence for the remainder; yet somehow the transcription continues. This seems to be expected behavior, so one needs to be aware of how the audio model degrades and of the temperature ranges the endpoints enforce to limit it.