Can I use logprobs & function calling at the same time?

Hi,

It seems that whenever I use function calling, I can't get the logprobs of the tokens in the response to my API call. Even though I set logprobs=True and top_logprobs=5, the choice in the response contains logprobs=ChoiceLogprobs(content=None). When I remove the function from the API call, the logprobs are available. Is this a bug, or did I do something wrong? Here is my code:

from openai import OpenAI

client = OpenAI()

# `function` is my list of function definitions and `formatted_prompt` is the
# system prompt; both are defined elsewhere in my code.
evaluation = client.chat.completions.create(
    model=model_name,
    functions=function,
    function_call={"name": "function_name"},
    messages=[
        {"role": "system", "content": formatted_prompt}
    ],
    logprobs=True,
    top_logprobs=5,
)

Thankful for any help.

Logprobs seem to be turned off when a function is employed.

The likely reason is that OpenAI doesn't want to reveal the way the model formats what it sends to a tool recipient.
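
As a quick check, here is a minimal sketch of the behaviour. The model name, function schema, and prompt below are placeholders of my own, not from the post above; the point is just that choices[0].logprobs.content holds per-token logprobs for a plain completion, but comes back as None once a function call is forced.

from openai import OpenAI

client = OpenAI()

# Hypothetical function schema, purely for illustration.
weather_function = {
    "name": "get_weather",
    "description": "Return the weather for a city.",
    "parameters": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}

messages = [{"role": "user", "content": "What is the weather in Paris?"}]

# Plain completion: logprobs.content is a list of per-token logprobs.
plain = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=messages,
    logprobs=True,
    top_logprobs=5,
)
print(plain.choices[0].logprobs.content[:3])

# Forced function call: logprobs.content comes back as None.
forced = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=messages,
    functions=[weather_function],
    function_call={"name": "get_weather"},
    logprobs=True,
    top_logprobs=5,
)
print(forced.choices[0].logprobs)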

Alright, that helps a lot, thank you! Let's hope they change this in the future 🙂

Are there any plans to support logprobs for tools/function calling in the future?
