This example is from OpenAI.
How would you do one-shot with the first example?

Like this?

Role: User
Content: what is the weather going to be like in Glasgow, Scotland over the next 5 days

Role: Assistant
Content: None
function_call: {'name': 'get_n_day_weather_forecast',
                'arguments': '{\n  "location": "Glasgow, Scotland",\n  "format": "celsius",\n  "num_days": 5\n}'}
finish_reason: 'function_call'


The example in the notebook is a one-shot example.
Or do you mean to omit the 'system' role message entirely?

To use function calling, you only need a good description of the function and of its (optional) parameters.

In this example, the user asks a question like "What is the weather in location XYZ for the next n days?". This question should be relatable to the description of the function. If the user's message fits the description of one of the function(s), then GPT formulates the function request and sends it as a response to your side (backend).
You are then responsible for reading out the name and arguments and passing the result back to GPT as a function-role message. GPT then formulates a message back to the user (frontend).
This is a bidirectional dialogue between the user and you. GPT is the middleman which acts as an API wrapper: NLP → "Your API" → NLP.

Or you can say: GPT translates natural language into a function call, and the result is translated back into natural language.
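A minimal sketch of that round trip (the weather function and its return value are invented here; only the message roles and the overall flow come from the API) could look like this:

import json
import openai

# Hypothetical backend function that your code owns; GPT never runs this itself.
def get_current_weather(location, format="celsius"):
    return json.dumps({"location": location, "temperature": 21, "unit": format})

functions = [{
    "name": "get_current_weather",
    "description": "Get the current weather",
    "parameters": {
        "type": "object",
        "properties": {
            "location": {"type": "string", "description": "The city and state, e.g. San Francisco, CA"},
            "format": {"type": "string", "enum": ["celsius", "fahrenheit"]},
        },
        "required": ["location"],
    },
}]

messages = [{"role": "user", "content": "What's the weather like today in Lisbon?"}]

# 1) GPT decides whether the user message fits one of the described functions.
first = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=messages,
    functions=functions,
    function_call="auto",
    temperature=0.0,
)
reply = first.choices[0].message

if reply.get("function_call"):
    # 2) Your backend reads out the name and arguments and runs the real function.
    args = json.loads(reply["function_call"]["arguments"])
    result = get_current_weather(**args)

    # 3) Pass the result back to GPT with the "function" role ...
    messages.append(reply)
    messages.append({"role": "function", "name": reply["function_call"]["name"], "content": result})

    # 4) ... and GPT turns it back into natural language for the user (frontend).
    second = openai.ChatCompletion.create(
        model="gpt-3.5-turbo-0613",
        messages=messages,
        functions=functions,
    )
    print(second.choices[0].message["content"])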

You can also force GPT to use a specific function by using the "function_call" key in the request body. It will then make the function call even if the user's message doesn't fit, and you will mostly get a fabricated answer.
It is important to set the temperature very low for function calls to work properly.
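As a small sketch of what forcing a specific function looks like in the request (the stripped-down schema here is just for illustration, based on the get_current_weather example below):

import openai

# Minimal schema, just enough for the demonstration.
functions = [{
    "name": "get_current_weather",
    "description": "Get the current weather",
    "parameters": {
        "type": "object",
        "properties": {"location": {"type": "string"}},
        "required": ["location"],
    },
}]

# Forcing the function: GPT will call get_current_weather even though the
# user message never asks about the weather, so expect fabricated arguments.
response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=[{"role": "user", "content": "Tell me something about Lisbon."}],
    functions=functions,
    function_call={"name": "get_current_weather"},   # instead of "auto"
    temperature=0.0,                                  # keep it low so the arguments stay stable
)
print(response.choices[0].message["function_call"])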

But you do not need to use this feature as a function call. You can also use it to extract content from the user's message into a well-defined JSON blob. What is important is that the descriptions fit what the user wants, or what you want to get from the user. To use few-shots (in the sense of what I understand from you), you give one or two examples in the description of the function (what its use case is) and of the possible parameters. These can be anything: strings, numbers, enums etc. (except arrays).
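A sketch of that extraction use case might look like this (the function name and fields are invented; nothing ever needs to execute on your side):

import openai

# A "function" that only exists to make GPT return a well-defined JSON blob.
extract_contact = {
    "name": "extract_contact_details",            # hypothetical, purely for extraction
    "description": "Extract the contact details mentioned in the user's message.",
    "parameters": {
        "type": "object",
        "properties": {
            "name": {"type": "string", "description": "Full name of the person."},
            "city": {"type": "string", "description": "City the person lives in."},
            "age": {"type": "integer", "description": "Age in years, if mentioned."},
        },
        "required": ["name"],
    },
}

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=[{"role": "user", "content": "Hi, I'm Ana, 34, calling from Porto."}],
    functions=[extract_contact],
    function_call={"name": "extract_contact_details"},
    temperature=0.0,
)
print(response.choices[0].message["function_call"]["arguments"])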

Is this what you mean?

Ok, I see. So, let me show a very simple example.

Thanks for the explanation. What I mean is this small example, slightly modified from OpenAI's first example.

import openai

functions = [
    {
        "name": "get_current_weather",
        "description": "Get the current weather",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The city and state, e.g. San Francisco, CA",
                },
                "format": {
                    "type": "string",
                    "enum": ["celsius", "fahrenheit"],
                    "description": "The temperature unit to use.",
                },
            },
            "required": ["location", "format"],
        },
    }
]

messages = []
messages.append({"role": "system", "content":"Don't make assumptions about what values to plug into functions. "})
messages.append({"role": "user", "content": "What's the weather like today in Lisbon"})
response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=messages,
    temperature=0.0,
    functions=functions,
    function_call="auto",
    stream=False
)
print(response)

What would I need to do to write one or two examples? From your answer, I gather I should write them explicitly in the description? Like this?

functions = [
    {
        "name": "get_current_weather",
        "description": """Get the current weather.

Role: User
Content: How hot is today in London?        

Role: Assistant
Content: {'city':'London','format':'fahrenheit'}
""",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The city and state, e.g. San Francisco, CA",
                },
                "format": {
                    "type": "string",
                    "enum": ["celsius", "fahrenheit"],
                    "description": "The temperature unit to use. ",
                },
            },
            "required": ["location", "format"],
        },
    }
]

messages = []
messages.append({"role": "system", "content":"Don't make assumptions about what values to plug into functions. "})
messages.append({"role": "user", "content": "What's the weather like today in Lisbon"})
response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=messages,
    temperature=0.0,
    functions=functions,
    function_call="auto",
    stream=False
)
print(response)

Oh, no. You don't need to give examples in the description itself. You can use the description the way it is used in the OpenAI example. That is enough for GPT to infer the function to be called.
So, as in your example, asking for the weather in Lisbon should be enough for GPT to use the get_current_weather function.

Btw. I like Portugal. Stayed there about 1 1/2 years in the Algarve as a Digital Nomad during the lockdowns. Very friendly and open hearted people. Nice culture too.

Thanks! Yes, always nice weather in Algarve ;).
Ok, but my point is: what if I want to do few-shot for function calling?


So you want to call several functions at once in only one request?

I think this is not possible with the chat completion.

Have it understand the way it should answer through multiple examples.

I don't want to call several functions. I want to do simple in-context few-shot learning when using the function calling feature.

Take the example I gave. Imagine you want to give it a few shots so that it returns the "fahrenheit" unit by default.


Ah, I think I understand it now. So GPT should decide, based on the location, whether to choose "fahrenheit" or "celsius" without the user needing to explicitly tell it?

You can do this in the description of the format property.
In this example the description can be changed to "The temperature unit to be used based on the location, e.g. San Francisco (fahrenheit), Lisbon (celsius)".

This is a few-shot example from which GPT should infer the correct format based on the location in the request. If it fails, you can add more examples to the description.
This should work.
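In code, only the description of the format property in the earlier schema changes; something like:

# Same schema as before; only the "format" description now carries the few-shot hints.
functions = [
    {
        "name": "get_current_weather",
        "description": "Get the current weather",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The city and state, e.g. San Francisco, CA",
                },
                "format": {
                    "type": "string",
                    "enum": ["celsius", "fahrenheit"],
                    "description": (
                        "The temperature unit to be used based on the location, "
                        "e.g. San Francisco (fahrenheit), Lisbon (celsius)."
                    ),
                },
            },
            "required": ["location", "format"],
        },
    }
]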


I'm working on a more complex problem that I think is not adaptable to this simple example. Do you know if I can find somewhere how they fine-tuned gpt-3.5 for the function_call feature? Knowing the syntax they gave the LLM would make it easier to implement few-shot for any case.

How they fine-tuned their model to recognize function calling is not documented anywhere. Maybe you can chunk your task into smaller pieces?

The thing to understand here is that function calling introduced a new role for the chat prompt messages (“role”: “function”). To use few-shot examples with chat model prompts you provide a series of alternating (possibly ‘fake’) messages that show how the assistant did / should respond to a given user input. With function calling the principle is the same but rather than providing a series of alternating user-assistant example messages, you provide alternating user-function messages.

e.g.

schema = {
    "type": "object",
    "properties": {
        "object_type": {"type": "string"},
        "geometry": {
            "type": "array",
            "items": {
                "type": "number"
            }
        }
    },
    "required": ["object_type", "geometry"]
}

example_response_1 = "{\"object_type\": \"point\", \"geometry\": [2.3, 1.0]}"
example_response_2 = "{\"object_type\": \"line\", \"geometry\": [[1.0, 2.0], [3.0, 4.0]]}"

few_shot_function_calling_example = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=[
        {"role": "system", "content": "You are a system for returning geometric objects in JSON."},
        {"role": "user", "content": "give me a point"},
        {"role": "function", "name": "example_func", "content": example_response_1},
        {"role": "user", "content": "give me a line"},
        {"role": "function", "name": "example_func", "content": example_response_2},
        {"role": "user", "content": "give me a polygon"}
    ],
    functions=[{"name": "example_func", "parameters": schema}],
    function_call={"name": "example_func"},
    temperature=0
)

print(few_shot_function_calling_example.choices[0].message)

{
  "content": null,
  "function_call": {
    "arguments": "{\"object_type\": \"polygon\", \"geometry\": [[0, 0], [0, 5], [5, 5], [5, 0]]}",
    "name": "example_func"
  },
  "role": "assistant"
}

Very nice! This is exactly what I was looking for. I didn't realise they added a new role.

Just noticed that it predicted the function arguments correctly, but with the "assistant" role. Shouldn't it be the "function" role?

I think it is because it is the assistant that responds, even if the response is a function_call response as opposed to a content response - i.e. it just follows the chat completion response format explained here: OpenAI Platform.


Is there any difference between adding the examples as messages (as you did here) and putting them inline in the system message?

Naively, it feels like having 10 fake user messages could bias later responses in the chat.

I don’t know. Something to experiment with maybe.

FWIW, I found that the method described in Few-shot and function calling - #15 by lucas.godfrey1000 (passing examples as JSON-as-string in the user content) confused the model and would frequently result in it giving me JavaScript and not JSON.

I had better results when passing examples using the same structure as what the API call returns, e.g.:

[
    {
        "role": "user",
        "content": f"make terms for {ex_text}",
    },
    {
        "role": "assistant",
        "content": None,
        "function_call": {
            "name": FUNC_NAME,
            "arguments": json.dumps(example),
        },
    },
]

Furthermore, the conversation history can carry even more context and logic when it includes the AI's answer based on that function result.

{
    "role": "user",
    "content": "Post a tweet for me 'AI can affect the outside world!'"
},
{
    "role": "function",
    "name": "twitter_post",
    "content": "Twitter API: message to xxx user account success"
},
{
    "role": "assistant",
    "content": "It's posted, the world should be able to see your message."
},
{
    "role": "user",
    "content": "Help with math. What is sin(333)?"
},
{
    "role": "function",
    "name": "wolfram_alpha",
    "content": "-0.00882 rad"
},
{
    "role": "assistant",
    "content": "In radians, the answer to sin(333) is -0.00882."
},

That will prevent situations where it loops, asking for the same function call countless times, or doesn't know how well it answered. It will also be able to build on the context of its prior answers when formulating new responses that refer to them.

Well, this is kind of an odd result, but whenever I try to do few-shot learning, I've also noticed that it "forgets" the system prompts and examples.

It's a bit of a quirk, but I've found that if you supply the few-shot assistant-role prompts like @cjmungall suggests, the results can be better. However, they are not better if you actually supply the function, as it seems to revert back to forgetting the few-shot examples.

To get around this, it seems to work only if you provide the examples with the assistant responses, but when you call the API, do not provide the actual function for it to use. This causes GPT to infer the function from the examples(!). It thus uses the context from the examples to produce the result, without explicitly having the schema.

Because you must provide a function to make a function call, leave function_call set to "auto" and provide a null function in the functions argument. If you have been explicit enough, GPT will choose the function from the examples, even though it was not provided as a function.
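A rough sketch of that setup (reusing the geometry examples from earlier in the thread; the "make_geometry" name and the "noop" placeholder are my own guesses at a minimal version, not tested code):

import json
import openai

# Few-shot examples supplied as assistant messages with a function_call,
# in exactly the shape the API itself returns.
messages = [
    {"role": "system", "content": "You are a system for returning geometric objects in JSON."},
    {"role": "user", "content": "give me a point"},
    {
        "role": "assistant",
        "content": None,
        "function_call": {
            "name": "make_geometry",  # hypothetical name; never defined in the functions argument below
            "arguments": json.dumps({"object_type": "point", "geometry": [2.3, 1.0]}),
        },
    },
    {"role": "user", "content": "give me a line"},
    {
        "role": "assistant",
        "content": None,
        "function_call": {
            "name": "make_geometry",
            "arguments": json.dumps({"object_type": "line", "geometry": [[1.0, 2.0], [3.0, 4.0]]}),
        },
    },
    {"role": "user", "content": "give me a polygon"},
]

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=messages,
    # A minimal "null" function so the request may use function calling at all;
    # the real schema is only implied by the examples above.
    functions=[{"name": "noop", "parameters": {"type": "object", "properties": {}}}],
    function_call="auto",
    temperature=0,
)
print(response.choices[0].message)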
