Few-shot and function calling

Hi,
Are there any examples or guides on how I should do few-shot prompting when using function calling?

I'm using function calling to extract structured data from the prompt, but for some examples gpt-3.5 doesn't give very good answers. I would like to try a bit of few-shot prompting, but I'm not sure what examples I should give it, because I'm expecting a function_call.

What content should I give as an example? An example of the full function structure that we pass as a parameter to openai.ChatCompletion.create?


Great free courses on prompting for Developers.

May you provide an example?
This could help better understand your use case.

This example is from OpenAI.
How would you do one-shot with the first example?

Like this?

Role: User
Content: what is the weather going to be like in Glasgow, Scotland over the next 5 days

Role: Assistant
Content: None
function_call: {'name': 'get_n_day_weather_forecast',
'arguments': '{\n "location": "Glasgow, Scotland",\n "format": "celsius",\n "num_days": 5\n}'}
finish_reason: 'function_call'


The example in the notebook is a one-shot example.
Or do you mean to omit the 'system' role message entirely?

To use function calling, you only need a good description of the function and of the optional parameters.

In this example, the user asks, "What is the weather in location XYZ for the next n days?" This question should be relatable to the description of the function. If the user's message fits the description of one of the function(s), GPT formulates the function request and sends it as a response to your side (backend).
You are then responsible for reading out the name and arguments and passing the result back as a function-role message to GPT. GPT then formulates a message back to the user (frontend).
This is a bidirectional dialogue between the user and you. GPT is the middleman, acting as an API wrapper: NLP → "your API" → NLP.

Or, put differently: GPT translates natural language into a function call, and the result is translated back into natural language.
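For illustration, here is a minimal sketch of that round trip; the assistant message and the get_current_weather stub are made up, and in a real app the assistant message would come back from openai.ChatCompletion.create:

```python
import json

# A made-up assistant response of the kind GPT sends back when it
# decides to call a function (in a real app this comes from the API).
assistant_message = {
    "role": "assistant",
    "content": None,
    "function_call": {
        "name": "get_current_weather",
        "arguments": '{"location": "Lisbon", "format": "celsius"}',
    },
}

def get_current_weather(location, format):
    # Stub standing in for a real weather API on your backend.
    return {"location": location, "temperature": 22, "unit": format}

# 1) Read out the name and arguments GPT put in the function_call.
name = assistant_message["function_call"]["name"]
args = json.loads(assistant_message["function_call"]["arguments"])

# 2) Run the function and hand the result back as a function-role
#    message; GPT would then phrase a natural-language answer from it.
result = get_current_weather(**args)
function_message = {"role": "function", "name": name, "content": json.dumps(result)}
print(function_message["content"])
# → {"location": "Lisbon", "temperature": 22, "unit": "celsius"}
```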

You can also force GPT to use a specific function via the "function_call" key in the request body. It will then use that function even if the user's message doesn't fit it, and you will mostly get a fabricated answer.
It is important to set the temperature very low for function calls to work properly.

But you don't have to use this feature as an actual function call. You can also use it to extract content from the user's message into a well-defined JSON blob. What matters is that the descriptions fit what the user wants, or what you want from the user. To use few-shot examples (in the sense I understand from you), give one or two examples in the description of the function (what its use case is) and of the possible parameters. These can be anything: strings, numbers, enums, etc. (except arrays).
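As a sketch of that extraction use case: the name extract_contact and its fields below are invented for illustration, and the "function" is never actually executed; the schema and descriptions only steer GPT toward a well-defined JSON blob.

```python
# Hypothetical extraction "function": it is never executed; the schema
# and descriptions only steer GPT toward a well-defined JSON blob.
extract_contact = {
    "name": "extract_contact",
    "description": (
        "Extract contact details from the user's message. "
        "Example: 'Mail Ana at ana@example.com' gives "
        "person='Ana', email='ana@example.com'."
    ),
    "parameters": {
        "type": "object",
        "properties": {
            "person": {"type": "string", "description": "The person's name"},
            "email": {"type": "string", "description": "An e-mail address, if present"},
        },
        "required": ["person"],
    },
}
```

You would then pass functions=[extract_contact] in the request, and function_call={"name": "extract_contact"} to force it.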

Is this what you mean?

Ok, I see. So, let me show a very simple example.

Thanks for the explanation. What I mean is this small example, slightly modified from OpenAI's first example.

import openai

functions = [
    {
        "name": "get_current_weather",
        "description": "Get the current weather",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The city and state, e.g. San Francisco, CA",
                },
                "format": {
                    "type": "string",
                    "enum": ["celsius", "fahrenheit"],
                    "description": "The temperature unit to use.",
                },
            },
            "required": ["location", "format"],
        },
    }
]

messages = []
messages.append({"role": "system", "content":"Don't make assumptions about what values to plug into functions. "})
messages.append({"role": "user", "content": "What's the weather like today in Lisbon"})
response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=messages,
    temperature=0.0,
    functions=functions,
    function_call="auto",
    stream=False,
)
print(response)

What would I need to do to write one or two examples? From your answer, I should write them explicitly in the description? Like this?

functions = [
    {
        "name": "get_current_weather",
        "description": """Get the current weather.

Role: User
Content: How hot is today in London?        

Role: Assistant
Content: {'city':'London','format':'fahrenheit'}
""",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The city and state, e.g. San Francisco, CA",
                },
                "format": {
                    "type": "string",
                    "enum": ["celsius", "fahrenheit"],
                    "description": "The temperature unit to use. ",
                },
            },
            "required": ["location", "format"],
        },
    }
]

messages = []
messages.append({"role": "system", "content":"Don't make assumptions about what values to plug into functions. "})
messages.append({"role": "user", "content": "What's the weather like today in Lisbon"})
response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=messages,
    temperature=0.0,
    functions=functions,
    function_call="auto",
    stream=False,
)
print(response)

Oh, no. You don't need to give examples in the description itself. You can use the description as it is used in the OpenAI example. That is enough to infer the function to be called.
So, like in the example, asking for the weather in Lisbon should be enough for GPT to use the get_current_weather function.

Btw, I like Portugal. I stayed there for about 1½ years in the Algarve as a digital nomad during the lockdowns. Very friendly and open-hearted people. Nice culture, too.

Thanks! Yes, always nice weather in the Algarve ;).
Ok, but my point is: what if I want to do few-shot for function calling?


So you want to call several functions at once in only one request?

I think this is not possible with the chat completion.

Have it understand how it should answer through multiple examples.

I don't want to call several functions. I want to do simple in-context few-shot learning when using the function calling feature.

Take the example I gave. Imagine you want to give it some few-shot examples so that it returns the "fahrenheit" unit by default.


Ah, I think I understand it now. So GPT should decide, based on the location, whether to choose "fahrenheit" or "celsius", without the user needing to explicitly tell it?

You can do this in the description of the format property.
In this example, the description can be changed to "The temperature unit to use, based on the location, e.g. San Francisco (fahrenheit), Lisbon (celsius)".

With this few-shot example, GPT should infer the correct format based on the location in the request. If it fails, you can add more examples to the description.
This should work.
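Concretely, relative to the earlier snippet, only the description of the format property changes (the city hints are just illustrative examples):

```python
functions = [
    {
        "name": "get_current_weather",
        "description": "Get the current weather",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The city and state, e.g. San Francisco, CA",
                },
                "format": {
                    "type": "string",
                    "enum": ["celsius", "fahrenheit"],
                    # Few-shot hints packed into the property description:
                    "description": (
                        "The temperature unit to use, based on the location, "
                        "e.g. San Francisco (fahrenheit), Lisbon (celsius)."
                    ),
                },
            },
            "required": ["location", "format"],
        },
    }
]
```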


I'm working on a more complex problem that I don't think fits this simple example. Do you know if I can find somewhere how they fine-tuned gpt-3.5 for the function_call feature? Knowing the syntax they gave the LLM would make it easier to implement few-shot for any case.

How they fine-tuned their model to recognize function calling is not documented anywhere. Maybe you can chunk your task into smaller pieces?

The thing to understand here is that function calling introduced a new role for the chat prompt messages ("role": "function"). To use few-shot examples with chat model prompts, you provide a series of alternating (possibly 'fake') messages that show how the assistant did / should respond to a given user input. With function calling the principle is the same, but rather than providing a series of alternating user-assistant example messages, you provide alternating user-function messages.

e.g.

schema = {
    "type": "object",
    "properties": {
        "object_type": {"type": "string"},
        "geometry": {
            "type": "array",
            "items": {
                "type": "number"
            }
        }
    },
    "required": ["object_type", "geometry"]
}

example_response_1 = "{\"object_type\": \"point\", \"geometry\": [2.3, 1.0]}"
example_response_2 = "{\"object_type\": \"line\", \"geometry\": [[1.0, 2.0], [3.0, 4.0]]}"

few_shot_function_calling_example = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=[
        {"role": "system", "content": "You are a system for returning geometric objects in JSON."},
        {"role": "user", "content": "give me a point"},
        {"role": "function", "name": "example_func", "content": example_response_1},
        {"role": "user", "content": "give me a line"},
        {"role": "function", "name": "example_func", "content": example_response_2},
        {"role": "user", "content": "give me a polygon"},
    ],
    functions=[{"name": "example_func", "parameters": schema}],
    function_call={"name": "example_func"},
    temperature=0,
)

print(few_shot_function_calling_example.choices[0].message)

{
  "content": null,
  "function_call": {
    "arguments": "{\"object_type\": \"polygon\", \"geometry\": [[0, 0], [0, 5], [5, 5], [5, 0]]}",
    "name": "example_func"
  },
  "role": "assistant"
}

Very nice! This is exactly what I was looking for. I didn't realise they'd added a new role.

I just noticed that it predicted the function arguments correctly, but with the "assistant" role. Shouldn't it be the "function" role?

I think it's because it is the assistant that responds, even if the response is a function_call response as opposed to a content response, i.e. it just follows the chat completion response format explained here: OpenAI Platform.
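As a sketch, the two kinds of assistant replies can be told apart by checking for the function_call field; the message dict below copies the example output shown earlier in the thread:

```python
import json

# The assistant message from the example output above.
message = {
    "content": None,
    "function_call": {
        "arguments": '{"object_type": "polygon", "geometry": [[0, 0], [0, 5], [5, 5], [5, 0]]}',
        "name": "example_func",
    },
    "role": "assistant",
}

# The role is always "assistant"; the "function" role is only for the
# messages *you* send back containing a function's result.
if message.get("function_call"):
    args = json.loads(message["function_call"]["arguments"])
    print(message["function_call"]["name"], args["object_type"])  # example_func polygon
else:
    print(message["content"])
```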


Is there any difference between adding the examples as messages (as you did here) and putting them inline in the system message?

Naively, it feels like having 10 fake user messages could bias later responses in the chat.

I don’t know. Something to experiment with maybe.