I really like using temperature > 0 when I am generating lists, as I can use repeated API calls to generate lists of unlimited length.
Is there a way I can do this and still have deterministic outputs, e.g. by setting a “random seed” for the randomization introduced into the model?
joey
Hi there, that’s an interesting idea, but unfortunately it’s not possible at this time. When you set top_p to 0, you’ll get the same completions, even with a high temperature, but I assume that’s not really what you’re looking for.
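For what it's worth, a minimal sketch of that behaviour (assuming the current openai Python SDK; the model name is just an illustration):

from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# With top_p = 0, sampling is restricted to the single most likely token,
# so repeated calls return the same completion even at a high temperature.
response = client.completions.create(
    model="gpt-3.5-turbo-instruct",  # illustrative model name
    prompt="Here's a list of discussion topics for chatrooms:\n1)",
    temperature=1.0,
    top_p=0,
    max_tokens=50,
)
print(response.choices[0].text)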
If you are looking to randomly seed the prompt, you may consider building a random (or structured/conditional) generator in your UI stack (Python, JavaScript, etc.) that calls the GPT endpoints and either
a) selects from some pre-determined list of seed topics, or
b) if you are an over-engineer like me, calls a second GPT endpoint for a (depending on the temperature, nearly truly) random topic.
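A rough sketch of what I mean (assuming the current openai Python SDK; the topic list, prompts, and model names are placeholders):

import random
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

SEED_TOPICS = ["space travel", "retro video games", "sourdough baking"]  # option (a)

def random_topic_from_model() -> str:
    # Option (b): ask a second GPT call for a (nearly) random topic.
    response = client.completions.create(
        model="gpt-3.5-turbo-instruct",  # placeholder model name
        prompt="Name one random discussion topic. Topic:",
        temperature=1.0,
        max_tokens=10,
        stop=["\n"],
    )
    return response.choices[0].text.strip()

def generate_topic_list(use_model_seed: bool = False) -> str:
    topic = random_topic_from_model() if use_model_seed else random.choice(SEED_TOPICS)
    prompt = f"Here's a list of chatroom discussion topics related to {topic}:\n1)"
    response = client.completions.create(
        model="gpt-3.5-turbo-instruct",  # placeholder model name
        prompt=prompt,
        temperature=0.7,
        max_tokens=100,
    )
    return response.choices[0].text

print(generate_topic_list(use_model_seed=True))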
Good luck!
Excelsior!
Drew
Thanks for the suggestions, I think that together they will solve my problem!
Alan
I tried experimenting with this. You can insert some random characters in the prompt that do seem to act as a random seed and therefore change the output you get. Basically, just put a bunch of random characters into the question number, and you [edited: sometimes] get a different response when you ask the same question, even with temperature set to zero. Here’s the saved prompt I tried it with:
Question: BB2922B43
Here’s a list of discussion topics for chatrooms:
1)
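To automate the idea, you could generate the random characters in code, something like this sketch (assuming the current openai Python SDK; the model name is a placeholder):

import random
import string
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def seeded_prompt() -> str:
    # Random alphanumeric string used as a pseudo-seed inside the prompt.
    noise = "".join(random.choices(string.ascii_uppercase + string.digits, k=9))
    return (
        f"Question: {noise}\n"
        "Here's a list of discussion topics for chatrooms:\n"
        "1)"
    )

response = client.completions.create(
    model="gpt-3.5-turbo-instruct",  # placeholder model name
    prompt=seeded_prompt(),
    temperature=0,  # sampling itself is deterministic; only the prompt "seed" varies
    max_tokens=100,
)
print(response.choices[0].text)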
joey
I wasn’t able to replicate that. I tried around 6 generations each on 5 different questions, and in those 30 experiments, I never got a different response within the same prompt.
Alan
Sorry about that. I tried it twice and got two different results, so I assumed it would keep generating different ones over and over. It seems that it doesn’t: with further tries, those two were the ONLY results I got. …back to the drawing board.

Interesting: I incremented each character of that Question to see whether that changes the seed, but I got the same question both times: “What is the best way to get a job in the video game industry?”
Upping the temperature (to 0.25) gave me the same response as above the first time, and then two different questions the next time.
Excelsior!
Drew
@sridhar1 To some extent this defeats the point of temp 0 (max likelihood), and you may instead want to experiment with finding the threshold top_p and temperature that just about generate variations.
For temperature 0, you can try prepending the generation with random numbers or neutral text; this will influence the completion. You can also randomize a neutral starting word for each list item and then complete with temp 0.
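For example, a rough sketch of the second approach (assuming the current openai Python SDK; the starter words and model name are placeholders):

import random
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Neutral starting words randomized per list item; the completion itself runs at temp 0.
STARTERS = ["How", "What", "Why", "Which", "Where"]

items = []
for _ in range(5):
    starter = random.choice(STARTERS)
    prompt = f"Here's a list of discussion topics for chatrooms:\n1) {starter}"
    response = client.completions.create(
        model="gpt-3.5-turbo-instruct",  # placeholder model name
        prompt=prompt,
        temperature=0,   # deterministic sampling; variation comes from the starter word
        max_tokens=30,
        stop=["\n"],     # stop at the end of the first list item
    )
    items.append(starter + response.choices[0].text)

print("\n".join(items))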
Alan
Yeah, sorry about the fail on this one. I’ll try more tests before sharing next time.
art
A seed parameter would actually be super helpful!
Getting deterministic outputs is a very nice feature to have, product-wise.
It’s now possible to set a seed parameter:
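A minimal sketch with the Chat Completions endpoint (the model name is illustrative; with the same seed and parameters the output should be mostly deterministic, and system_fingerprint helps detect backend changes that can break that):

from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name
    messages=[{"role": "user", "content": "List 5 discussion topics for chatrooms."}],
    temperature=1.0,
    seed=12345,  # same seed + same parameters => (mostly) repeatable output
)

print(response.choices[0].message.content)
print(response.system_fingerprint)  # may be None for some models; changes with backend updates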