Pricing when requesting logprobs to calculate Perplexity Score

Hi everyone! I am requesting and storing the logprobs on each request with top_logprobs=5. I read that this can increase the cost of the API call because it constitutes “advanced usage”.
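For reference, here is roughly what I am doing (a minimal sketch with the OpenAI Python SDK; the model name and prompt are just placeholders):

```python
import math
from openai import OpenAI

client = OpenAI()

# Request per-token logprobs alongside the completion.
response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model
    messages=[{"role": "user", "content": "Write one sentence about the sea."}],
    logprobs=True,
    top_logprobs=5,
)

# Each generated token comes back with its logprob (plus the top-5 alternatives).
token_logprobs = [t.logprob for t in response.choices[0].logprobs.content]

# Perplexity = exp of the negative mean token logprob.
perplexity = math.exp(-sum(token_logprobs) / len(token_logprobs))
print(perplexity)
```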

Can anyone tell me if this is true? I found no resource from OpenAI on this issue.

Thanks.

Logprobs are simply part of the API response if you turn them on with the `logprobs` parameter (plus `top_logprobs` for the ranked alternatives), and they do not cost anything extra. They do not take extra computation, with one caveat: the logprobs OpenAI returns are a filtered recalculation rather than exactly what the sampler uses, which misrepresents the chance of special tokens that might invoke tools or stops.
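If you want to verify it yourself, compare the `usage` fields with and without logprobs enabled; billing is driven entirely by those token counts, and they do not change because logprobs were returned (a quick sketch, with a placeholder model name):

```python
from openai import OpenAI

client = OpenAI()

# Same prompt, with and without logprobs; only the usage token counts determine cost.
for with_logprobs in (False, True):
    extra = {"logprobs": True, "top_logprobs": 5} if with_logprobs else {}
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model
        messages=[{"role": "user", "content": "Say hello."}],
        **extra,
    )
    usage = response.usage
    print(with_logprobs, usage.prompt_tokens, usage.completion_tokens, usage.total_tokens)
```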

Wherever you read that, the information is incorrect.

AI is powerless to resist a leading question containing a falsehood; rather than push back, it produces a justification for the premise.
