Logit_bias not working as expected

I explained earlier why you don't receive logprobs that reflect your logit_bias. The relevant paper is Stealing Part of a Production Language Model.

OpenAI closed off that extraction technique. The logprobs you get back are now computed from a softmax that excludes your logit_bias (and temperature was never applied to them in the first place).
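You can see this for yourself with a quick comparison. A minimal sketch, assuming the current Chat Completions API in the Python SDK; the model name and the token ID in the bias are placeholders:

```python
from openai import OpenAI

client = OpenAI()

def top_logprobs(logit_bias=None):
    # Ask for logprobs on the first generated token,
    # optionally pushing a token ID up hard with logit_bias.
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model
        messages=[{"role": "user", "content": "Say one word."}],
        max_tokens=1,
        logprobs=True,
        top_logprobs=5,
        logit_bias=logit_bias or {},
    )
    first = resp.choices[0].logprobs.content[0]
    return {lp.token: lp.logprob for lp in first.top_logprobs}

# The reported logprobs with and without a strong bias look the same;
# only the *sampled* token changes, because the bias is applied after
# the logprobs you are shown were computed.
print(top_logprobs())
print(top_logprobs({"9906": 100}))  # 9906: an arbitrary token ID
```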

I made a sampling diagram to help picture what's going on.
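In code form, the pipeline I'm picturing looks roughly like this. This is a sketch of my own mental model, not OpenAI's actual implementation; the order of operations is the assumption being illustrated:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

def sample_step(raw_logits, logit_bias, temperature, rng):
    # 1. The logprobs you are shown: softmax over the raw logits,
    #    with neither logit_bias nor temperature applied.
    reported_logprobs = np.log(softmax(raw_logits))

    # 2. The distribution actually sampled from: bias added first,
    #    then temperature, then softmax.
    adjusted = (raw_logits + logit_bias) / max(temperature, 1e-6)
    sampling_probs = softmax(adjusted)

    token = rng.choice(len(raw_logits), p=sampling_probs)
    return token, reported_logprobs

rng = np.random.default_rng(0)
logits = np.array([2.0, 1.0, 0.5, -1.0])  # toy vocabulary of 4 tokens
bias = np.array([0.0, 100.0, 0.0, 0.0])   # strongly promote token 1
tok, lp = sample_step(logits, bias, temperature=0.7, rng=rng)
print(tok, lp)  # token 1 gets picked, but lp is unchanged by the bias
```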

Also hidden from your view is how certain the AI is about emitting the first token that initiates a function call. That token is one your promotion of other tokens via logit_bias cannot influence, so you can't use bias to rein in function-calling. And all logprobs are disabled when a function is invoked.
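To check that last point, here is a minimal sketch (the tool definition and model name are placeholders): offer a tool and request logprobs; when the model answers with a tool call, the logprobs field on the choice comes back empty.

```python
from openai import OpenAI

client = OpenAI()

resp = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=[{
        "type": "function",
        "function": {
            "name": "get_weather",  # hypothetical tool for illustration
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }],
    logprobs=True,
    top_logprobs=5,
)

choice = resp.choices[0]
print(choice.message.tool_calls)  # the function call the model decided to make
print(choice.logprobs)            # empty when a tool call is produced, per the point above
```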
