The thing to understand here is that function calling introduced a new role for chat prompt messages ("role": "function"). To use few-shot examples with chat model prompts, you provide a series of alternating (possibly 'fake') messages that show how the assistant did or should respond to a given user input. With function calling the principle is the same, but instead of providing a series of alternating user-assistant example messages, you provide alternating user-function messages.
e.g.
import openai  # legacy (pre-1.0) SDK; openai.ChatCompletion was removed in v1.0

# JSON Schema for the function's arguments. "geometry" is either a flat
# coordinate pair (a point) or a list of coordinate pairs (a line or polygon),
# so the items schema accepts both.
schema = {
    "type": "object",
    "properties": {
        "object_type": {"type": "string"},
        "geometry": {
            "type": "array",
            "items": {
                "anyOf": [
                    {"type": "number"},
                    {"type": "array", "items": {"type": "number"}},
                ]
            },
        },
    },
    "required": ["object_type", "geometry"],
}
example_response_1 = '{"object_type": "point", "geometry": [2.3, 1.0]}'
example_response_2 = '{"object_type": "line", "geometry": [[1.0, 2.0], [3.0, 4.0]]}'
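# (Equivalently, you can build these strings with the standard-library json
#  module and avoid hand-writing JSON, e.g.:
#  example_response_1 = json.dumps({"object_type": "point", "geometry": [2.3, 1.0]}))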
few_shot_function_calling_example = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=[
        {"role": "system", "content": "You are a system for returning geometric objects in JSON."},
        # Few-shot pairs: each user request is followed by a "function" message
        # showing the output we want for that kind of request.
        {"role": "user", "content": "give me a point"},
        {"role": "function", "name": "example_func", "content": example_response_1},
        {"role": "user", "content": "give me a line"},
        {"role": "function", "name": "example_func", "content": example_response_2},
        # The actual request
        {"role": "user", "content": "give me a polygon"},
    ],
    functions=[{"name": "example_func", "parameters": schema}],
    # Force the model to call example_func rather than reply in plain text
    function_call={"name": "example_func"},
    temperature=0,
)
print(few_shot_function_calling_example.choices[0].message)
{
"content": null,
"function_call": {
"arguments": "{\"object_type\": \"polygon\", \"geometry\": [[0, 0], [0, 5], [5, 5], [5, 0]]}",
"name": "example_func"
},
"role": "assistant"
}
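Since the arguments come back as a JSON string, you'll typically parse them before use. A minimal sketch against the legacy (pre-1.0) response object shown above:

import json

message = few_shot_function_calling_example.choices[0].message
args = json.loads(message["function_call"]["arguments"])
print(args["object_type"])  # "polygon"
print(args["geometry"])     # [[0, 0], [0, 5], [5, 5], [5, 0]]

Note that the model isn't guaranteed to emit valid JSON, so production code usually wraps the json.loads call in a try/except.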