I’m using OpenAI’s API with the following settings:
• temperature=0
• top_p=0
From my understanding, setting temperature and top_p to 0 should make the outputs deterministic, i.e. the API should always return the same result for the same input. However, I am still observing some randomness in the outputs. Additionally, I want to set top_k explicitly, but I don't see such a parameter in OpenAI's API. Is there a workaround for setting top_k, or am I misunderstanding how the API handles sampling?
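For concreteness, these are the request parameters I'm passing (model name and prompt are placeholders; in my actual code this dict is unpacked into `client.chat.completions.create(**params)` with the official `openai` Python client):

```python
# Placeholder request parameters illustrating my settings.
params = {
    "model": "gpt-4o-mini",           # placeholder model name
    "messages": [{"role": "user", "content": "Say hello."}],
    "temperature": 0,                 # no temperature scaling
    "top_p": 0,                       # nucleus sampling cut to the smallest possible set
    # note: there is no "top_k" key I can set here
}
print(params["temperature"], params["top_p"])
```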
Could anyone explain why randomness persists with these settings and if there’s a way to achieve complete determinism?
If my understanding is correct, there is no randomness in the neural network itself; randomness only enters during the sampling step of an LLM. If so, I'd expect top_p = 0 to remove the randomness completely.
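To illustrate my mental model (this is a toy sketch of my assumption, not OpenAI's actual implementation): as temperature goes to 0, sampling should collapse to argmax over the logits, which is deterministic as long as the logits themselves are identical between runs.

```python
import numpy as np

# Toy logits for a 3-token vocabulary (made-up numbers for illustration).
logits = np.array([2.0, 1.0, 0.5])

def greedy_pick(logits):
    # In the temperature -> 0 limit, sampling degenerates to
    # always picking the highest-logit token.
    return int(np.argmax(logits))

# Repeating the "sampling" many times always yields the same token.
picks = {greedy_pick(logits) for _ in range(100)}
print(picks)  # a single token index, every run
```

So if I still see varying outputs, either the logits differ between runs or the API is not doing pure greedy decoding at these settings.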