Questions regarding API sampling parameters (temperature, top_p)

Once you understand how the two parameters interact, as I described in the first post, you can mix them creatively; there is no prohibition on using both at the same time.

I would encourage considering some top_p reduction, because that removes the lottery-winning 0.001% token choices that make no sense or break code.
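To make that concrete, here is a minimal sketch of how nucleus (top_p) filtering works, using a toy five-token distribution and NumPy. The token names and probabilities are made up for illustration; the API does this internally over the whole vocabulary.

```python
import numpy as np

def top_p_filter(tokens, probs, top_p):
    """Keep the smallest set of most-likely tokens whose cumulative
    probability reaches top_p, then renormalize over that set."""
    order = np.argsort(probs)[::-1]             # most likely first
    sorted_probs = np.asarray(probs)[order]
    cumulative = np.cumsum(sorted_probs)
    cutoff = np.searchsorted(cumulative, top_p) + 1   # first index reaching top_p
    kept = order[:cutoff]
    kept_probs = sorted_probs[:cutoff] / sorted_probs[:cutoff].sum()
    return [tokens[i] for i in kept], kept_probs

# Toy distribution: one well-instructed token plus a tail of junk.
tokens = ["{", " The", "~~", "zqx", "@#"]
probs  = [0.50, 0.45, 0.03, 0.015, 0.005]

print(top_p_filter(tokens, probs, top_p=0.25))
# -> (['{'], array([1.])): the long-tail junk never gets a chance
```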


A combination of the two parameters that is disparate and initially non-intuitive, such as temperature = 2, top_p = 0.25, can satisfy creative-writing desires while making the code-like output formatting (JSON, for example) wrapped around that same creative writing less breakable.

AI output?

{"poem_composition": "The beauty of nature is a sight to behold\nA sight that's so pure and so grand…"
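For reference, a request with those settings might look like the sketch below. This assumes the current openai Python SDK and its chat.completions interface; the model name and prompts are placeholders, not a recommendation.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {"role": "system",
         "content": 'Respond only with JSON: {"poem_composition": "<your poem>"}'},
        {"role": "user", "content": "Write a short poem about nature."},
    ],
    temperature=2,   # high creativity among the tokens that survive...
    top_p=0.25,      # ...but only the top 25% of probability mass is eligible
)

print(response.choices[0].message.content)
```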

When "{" reaches 50% probability thanks to the quality of your instructions, top_p = 0.25 leaves no other token in the nucleus that could break your JSON. However, creativity that doesn't go off the rails is still possible, including word choices like:

sight = 10.91%
wond = 5.19%
wonder = 4.48%
place = 2.64%
tranquil = 2.62%

And high temperature gets "creative" by pushing those remaining candidates toward similar probabilities of being chosen, wherever that kind of ambiguity is still allowed.
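A rough numeric illustration of that flattening, assuming for simplicity that we just rescale the five quoted probabilities by the temperature and renormalize over them (the API's exact internal ordering of temperature and top_p operations isn't documented, so treat this as conceptual only):

```python
import numpy as np

# The candidate tokens and probabilities quoted above
tokens = ["sight", "wond", "wonder", "place", "tranquil"]
probs = np.array([0.1091, 0.0519, 0.0448, 0.0264, 0.0262])

def rescale_with_temperature(p, temperature):
    """Divide log-probabilities by the temperature and re-softmax.
    Equivalent to scaling the underlying logits, because the constant
    normalizer cancels in the softmax."""
    scaled = np.log(p) / temperature
    exp = np.exp(scaled - scaled.max())
    return exp / exp.sum()

for t in (1.0, 2.0):
    dist = rescale_with_temperature(probs, t)
    print(f"temperature={t}:",
          {tok: f"{x:.1%}" for tok, x in zip(tokens, dist)})

# temperature=1.0: "sight" dominates at roughly 42% of the renormalized set
# temperature=2.0: the five choices land much closer together (~30% down to ~15%),
# so repeated runs will vary the wording more.
```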
