OpenAI Function Calling - function_call="auto"

Below is my function calling code, followed by the schema list that I pass in via the functions parameter.
Notice that I have set function_call="auto".
Per the documentation, this means the model is free to pick between generating a message or calling a function, based on the user prompt.

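For context, messages is just the chat history (not shown in my snippet); assume something like the sketch below, where user_prompt is a hypothetical placeholder for the prompts quoted later.

# Assumed chat history; "user_prompt" is a hypothetical placeholder for the
# user questions quoted further down in this post.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": user_prompt},
]
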
response = openai.ChatCompletion.create(
    deployment_id="gpt-35-turbo-0613",
    messages=messages,
    functions=functions,
    function_call="auto", 
)
[
  {
    "name": "search_hotels",
    "description": "Retrieves hotels from the search index based on the parameters provided",
    "parameters": {
      "type": "object",
      "properties": {
        "location": {
          "type"       : "string"                                      ,
          "description": "The location of the hotel (i.e. Seattle, WA)"
        },
        "max_price": {
          "type"       : "number"                         ,
          "description": "The maximum price for the hotel"
        },
        "features": {
          "type": "string",
          "description": "A comma separated list of features (i.e. beachfront, free wifi, etc.)"
        }
      },
      "required": ["location"]
    }
  },
  {
    "name": "search_movies",
    "description": "Get Movie Name of a given actor, for an given year.",
    "parameters": {
      "type": "object",
      "properties": {
        "actor": {
          "type"       : "string"                                 ,
          "description": "The Name of the actor (i.e. Will Smith)"
        },
        "year": {
          "type"       : "number"                      ,
          "description": "The 4 digit year (e.g: 2001)"
        }
      },
      "required": ["actor", "year"]
    }
  },
  {
    "name": "default_function",
    "description": "This is the default function when none matches.",
    "parameters": {
      "type": "object",
      "properties": {
        "context": {
          "type": "string",
          "description": "Give the context of the user input in less than 15 words."
        }
      },
      "required": ["context"]
    }
  }
]
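
The outputs shown in the examples below are not the raw API payload; I reshape choices[0].message into a small dict, roughly like the sketch here (assuming the standard 0.x SDK response layout):

import json

# Reshape the assistant message into the {"function_name", "parameters"} form
# shown in the examples below. A sketch, assuming the 0.x response structure.
message = response["choices"][0]["message"]

if message.get("function_call"):
    # The model picked a function: take its name and parse the JSON arguments.
    parsed = {
        "function_name": message["function_call"]["name"],
        "parameters": json.loads(message["function_call"]["arguments"]),
    }
else:
    # The model replied with plain text instead of picking a function.
    parsed = {"content": message["content"]}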

When the user prompt is
"what are the best hotels around Manhattan under $300",
the output is:

{
  "function_name": "search_hotels",
  "parameters": {
    "location":"Manhattan",
    "max_price": 300
  }
}

When the user prompt is
"What are the action movies of Will Smith in year 2015?",
the output is:

{
  "function_name": "search_movies",
  "parameters": {
    "actor":"Will Smith",
    "year": 2015
  }
}
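
Once I have the parsed dict in that shape, I dispatch on the returned name; the *_impl handlers below are hypothetical placeholders for my real hotel/movie lookups, just to sketch the flow:

# Hypothetical dispatch table: maps the function name chosen by the model to a
# local implementation. The *_impl functions are placeholders, not real code.
def search_hotels_impl(location, max_price=None, features=None): ...
def search_movies_impl(actor, year): ...
def default_function_impl(context): ...

handlers = {
    "search_hotels": search_hotels_impl,
    "search_movies": search_movies_impl,
    "default_function": default_function_impl,
}

if "function_name" in parsed:
    result = handlers[parsed["function_name"]](**parsed["parameters"])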

The model has no problem choosing the correct function, since the prompts in the above scenarios are clear.

Below is where the problem arises.

When the user prompt is
"What are the budget friendly hotels in Manhattan where Will Smith stayed in 2015 for his action movie shooting?"

the model is not able to determine the function, so it generates the message given below instead (since function_call="auto" is set). Is there a way to find out which function the model was considering when it generated this response? Is it search_hotels, search_movies, or default_function?

Let's assume the response from the model is:

"Please provide some more information to better assist you with the required information."

Did the model consider search_hotels, search_movies, or default_function to generate the above response? I want to understand which function the model is leaning toward.
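
For reference, the assistant message I get back in this ambiguous case only contains text content; roughly sketched:

# Roughly what choices[0]["message"] contains in the ambiguous case
# (the content string is just the assumed example from above):
ambiguous_message = {
    "role": "assistant",
    "content": "Please provide some more information to better assist you with the required information."
}
# There is no "function_call" entry here, so the payload itself does not tell me
# which of the three functions the model was leaning toward.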
