How to create an assistant message with an example function call?

I’m trying to implement in-context learning using an assistant. I came across this thread (Few-shot and function calling - #15 by lucas.godfrey1000), but the code there uses openai.ChatCompletion.create rather than client.beta.threads.messages.create. The client.beta.threads.messages.create function only accepts the thread ID plus a fixed set of message fields, and I can’t figure out how to pass a “function_call” key and value to the assistant.

Here is a trimmed example of my code:

# Define the JSON schema for the assistant function's parameters
schema = {
    "type": "object",
    "properties": {
        "property1": {
            "type": "array",
            "items": {"type": "string"},
            "description": "description of property",
        },
    },
    "required": ["property1"],
}

# Create an assistant to run the function
assistant = client.beta.assistants.create(
    name="assistant name",
    instructions="You are an information retrieval agent that will extract specific properties from text I provide you...",
    tools=[{
        "type": "function",
        "function": {
            "name": "extract_information",
            "parameters": schema,
        },
    }],
    model="gpt-4-turbo-preview",
)

# Create an example user message asking the assistant to call extract_information
# (example_text is a placeholder for the input passage)
client.beta.threads.messages.create(
    thread_id=thread.id,
    role="user",
    content=f"Extract information from the following text:\n\n{example_text}",
)

# Create an example response to the user's input for in-context learning
# (this is the part I can't figure out: messages.create won't accept these fields)
client.beta.threads.messages.create(
    thread_id=thread.id,
    role="function",
    content="",
    function_call={
        "name": "extract_information",
        "arguments": {"property1": ["example1", "example2"]},
    },
)

You cannot place messages into an assistant’s thread that the API doesn’t allow; threads only accept a restricted set of roles and fields.

When you use Assistants, you are a guardrailed user, not a developer.

The only way to hand a function result back is to submit tool outputs, and that is possible only after the AI emits a tool call and the run status becomes requires_action:

https://platform.openai.com/docs/api-reference/runs/submitToolOutputs
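
For reference, here is a minimal polling sketch of that flow. It assumes the thread and assistant created above; extract_information_impl is a hypothetical stand-in for your own extraction logic, not part of the API.

import json
import time

# Start a run on the existing thread with the existing assistant
run = client.beta.threads.runs.create(
    thread_id=thread.id,
    assistant_id=assistant.id,
)

# Poll until the run finishes or asks for tool outputs
while run.status in ("queued", "in_progress"):
    time.sleep(1)
    run = client.beta.threads.runs.retrieve(thread_id=thread.id, run_id=run.id)

if run.status == "requires_action":
    tool_outputs = []
    for call in run.required_action.submit_tool_outputs.tool_calls:
        args = json.loads(call.function.arguments)
        result = extract_information_impl(**args)  # your own extraction logic
        tool_outputs.append({"tool_call_id": call.id, "output": json.dumps(result)})

    # This is the only point at which you can hand function results back to the run
    run = client.beta.threads.runs.submit_tool_outputs(
        thread_id=thread.id,
        run_id=run.id,
        tool_outputs=tool_outputs,
    )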

Here is an example of using the OpenAI Assistants API: npi/examples/openai/assistant.py at dev · npi-ai/npi · GitHub.

The Assistants API needs an event handler to process function-calling requests; it works asynchronously.
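
Roughly, the streaming pattern looks like this. This is only a sketch of the event-handler approach, not code taken from that repository, and extract_information_impl is again a placeholder for your own function:

import json

from openai import AssistantEventHandler

class ExtractionHandler(AssistantEventHandler):
    def on_event(self, event):
        # The run pauses and emits this event when the model wants a tool call
        if event.event == "thread.run.requires_action":
            tool_outputs = []
            for call in event.data.required_action.submit_tool_outputs.tool_calls:
                args = json.loads(call.function.arguments)
                result = extract_information_impl(**args)  # placeholder
                tool_outputs.append(
                    {"tool_call_id": call.id, "output": json.dumps(result)}
                )
            # Hand the outputs back and keep streaming with a fresh handler
            with client.beta.threads.runs.submit_tool_outputs_stream(
                thread_id=self.current_run.thread_id,
                run_id=self.current_run.id,
                tool_outputs=tool_outputs,
                event_handler=ExtractionHandler(),
            ) as stream:
                stream.until_done()

# Stream a run on the thread with the handler attached
with client.beta.threads.runs.stream(
    thread_id=thread.id,
    assistant_id=assistant.id,
    event_handler=ExtractionHandler(),
) as stream:
    stream.until_done()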

For in-context learning to improve function selection, have you tried adding few-shot examples to the instructions when calling client.beta.assistants.create?
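
For example, something along these lines (the example text and expected arguments are invented purely for illustration):

# Hypothetical few-shot examples baked directly into the instructions string
few_shot_examples = """
Example input:
    "Alice and Bob founded Acme Corp in 2019."
Expected extract_information arguments:
    {"property1": ["Alice", "Bob", "Acme Corp"]}
"""

assistant = client.beta.assistants.create(
    name="assistant name",
    instructions=(
        "You are an information retrieval agent that will extract specific "
        "properties from text I provide you...\n\n"
        "Follow these examples when calling extract_information:\n"
        + few_shot_examples
    ),
    tools=[{
        "type": "function",
        "function": {
            "name": "extract_information",
            "parameters": schema,
        },
    }],
    model="gpt-4-turbo-preview",
)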

The few-shot examples are too lengthy to fit into the context of an assistant’s system message. Regardless, I’m not sure whether the assistant would infer that the behavior shown in an example passed via a system/user message should be replicated when it calls the function.