Need help: function call incorrectly triggered

We have a self-trained LLM dedicated to statistical questions about “A” (a company).
I am using the gpt-3.5-turbo function calling feature to route questions about A to my own LLM, and ChatGPT should answer other questions (unrelated to A) itself.
Here is a sample of my function description:

"function_description": {
  "name": "quick_search_A_statistics",
  "description": "Search for any statistics of company A",
  "parameters": {
    "type": "object",
    "properties": {
      "query": {
        "type": "string",
        "description": "Full question in human language"
      },
      "reason_of_calling": {
        "type": "string",
        "description": "The reason for calling this function"
      }
    },
    "required": [
      "query",
      "reason_of_calling"
    ]
  }
}
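
Roughly how I wire this spec into a request and route the result (a minimal sketch, assuming the v1 openai Python SDK; ask_company_A_llm() is a hypothetical stand-in for our self-trained model, not the exact code):

import json
from openai import OpenAI

client = OpenAI()

def ask_company_A_llm(query: str) -> str:
    # Hypothetical stand-in for the self-trained statistics model.
    return f"[answer from company-A model for: {query}]"

tools = [{
    "type": "function",
    "function": {
        "name": "quick_search_A_statistics",
        "description": "Search for any statistics of company A",
        "parameters": {
            "type": "object",
            "properties": {
                "query": {"type": "string", "description": "Full question in human language"},
                "reason_of_calling": {"type": "string", "description": "The reason for calling this function"},
            },
            "required": ["query", "reason_of_calling"],
        },
    },
}]

def answer(user_question: str) -> str:
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": user_question}],
        tools=tools,
        tool_choice="auto",  # the model decides whether to call the function
    )
    message = response.choices[0].message
    if message.tool_calls:  # model chose to route to the company-A model
        args = json.loads(message.tool_calls[0].function.arguments)
        return ask_company_A_llm(args["query"])
    return message.content  # model answered directly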

Problem: When I ask it “what’s B’s daily income?”, it still triggers the above function call. I sense it has something to do with my prompt; I even added a ‘reason_of_calling’ parameter, but it is not helping.

The system prompt is quite simple:

Don't make assumptions about what values to plug into functions. 
Ask for clarification if a user request is ambiguous.

Would appreciate any help, many thanks in advance.

This type of question, and others about what invokes a function, always comes down to a statement I coined:

“The AI doesn’t know what it does or doesn’t know”

There might be fine-tuning that shows examples like “What’s the capital of France?”, but nothing like all the possibilities a user could input.

So you have to tell it through prompting. This is an example for your benefit, not necessarily for the AI:

  • You have been trained on a broad knowledge corpus of all topics and companies you can answer from.
  • The only thing you don’t know is our “B Company” founded in 2022. We provided a B Company knowledge base function you can call to inform your answers.

Thanks for the help.
In the example you provided, do you mean that, in the system prompt, I should instruct it to ‘forget’ any prior knowledge about company A and always seek help from the functions I provided?

The problem I am having is not that the function isn’t invoked; it’s that when I ask for statistics about B, it still invokes the function designed for A. I am confused, but I don’t think it’s the same thing.

Yes, in the system prompt. It is hard to dictate the entire operation of an AI just by a function description.

In the prompt we tell it that your company is “B” (maybe the opposite of your example) and that the AI doesn’t know about it. However, it does know about everything else and doesn’t need the function for that.

For clarity, an actual example within the example:


📋 Function description:
ChatGPT information

📋 Prompt:
ChatGPT was released at the end of 2022, newer than your knowledge. Use a function call to retrieve ChatGPT statistics and data.
You can answer all other questions simply from your training corpus knowledge, without a function.
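
To make that concrete, here is roughly how the prompt and the function description pair up in a single request; a minimal sketch assuming the v1 openai Python SDK, with get_chatgpt_information as an illustrative function name:

from openai import OpenAI

client = OpenAI()

system_prompt = (
    "ChatGPT was released at the end of 2022, newer than your knowledge. "
    "Use a function call to retrieve ChatGPT statistics and data. "
    "You can answer all other questions simply from your training corpus knowledge, without a function."
)

tools = [{
    "type": "function",
    "function": {
        "name": "get_chatgpt_information",  # illustrative name
        "description": "ChatGPT statistics and data",
        "parameters": {
            "type": "object",
            "properties": {
                "query": {"type": "string", "description": "Full question in human language"},
            },
            "required": ["query"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": "How many users does ChatGPT have?"},  # should call the function
    ],
    tools=tools,
    tool_choice="auto",
)
print(response.choices[0].message.tool_calls)
# A question like "What's the capital of France?" should instead come back with tool_calls=None.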


Your “reason_of_calling” parameter seems ambiguous. If I were an AI, the reason would likely always be “because I don’t know and was told I need to use this function!”

Thanks again for the quick reply.
Unfortunately, the system instruction is not working as expected; GPT is still consistently invoking quick_search_A_statistics() on topics other than A.

However, I tried a different approach: adding a new function, ‘get_other_information’, described as ‘Get information unrelated to A’. When I ask about B, it now invokes this new function. I can settle for this for now, I guess, but it is an additional API cost.

Seems to me the model is very eager to call any function available.
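
Roughly what the two-function workaround looks like, for anyone who finds this later (a minimal sketch, assuming the v1 openai Python SDK; the dispatch and the ask_company_A_llm() helper are illustrative, and the second tool-free call is where the extra API cost comes from):

import json
from openai import OpenAI

client = OpenAI()

def ask_company_A_llm(query: str) -> str:
    # Hypothetical stand-in for the self-trained statistics model.
    return f"[answer from company-A model for: {query}]"

tools = [
    {"type": "function", "function": {
        "name": "quick_search_A_statistics",
        "description": "Search for any statistics of company A",
        "parameters": {"type": "object",
                       "properties": {"query": {"type": "string", "description": "Full question in human language"}},
                       "required": ["query"]},
    }},
    {"type": "function", "function": {
        "name": "get_other_information",
        "description": "Get information unrelated to A",
        "parameters": {"type": "object",
                       "properties": {"query": {"type": "string", "description": "Full question in human language"}},
                       "required": ["query"]},
    }},
]

def handle(question: str) -> str:
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": question}],
        tools=tools,
        tool_choice="auto",
    )
    message = response.choices[0].message
    if not message.tool_calls:
        return message.content
    call = message.tool_calls[0]
    if call.function.name == "quick_search_A_statistics":
        return ask_company_A_llm(json.loads(call.function.arguments)["query"])
    # get_other_information: re-ask without tools so GPT answers itself -- the extra API call
    followup = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": question}],
    )
    return followup.choices[0].message.content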

Well, your AI might be dumb at function calling, but it’s not this dumb:

[screenshot]

(or actually it is, and is the same engine, but you can do better)

It is significantly improved after I put more detailed instructions in the system prompt. Thank you.
