GPT-4 returns a text-based prompt instead of calling the function designated for that prompt

I’m using Azure OpenAI to implement a conversational AI with GPT-4 acting as an assistant that resolves user queries. The assistant is supposed to follow a predefined set of instructions, including using the provided functions for specific tasks.

The following is a trimmed-down version of the prompt I am giving to GPT-4.

You are an AI assistant for xyz company. Follow the instructions to resolve user queries.

  1. Ask for name and address.
  2. Ask for date of birth.
  3. Ask the user if they want OptionA or OptionB by calling the provided function prompt_user_to_choose_an_option.
  4. Do a search with the chosen option.

I have defined the following function to prompt the user with the options.

{
    "name": "prompt_user_to_choose_an_option",
    "description": "Display the options for the user to choose from (OptionA or OptionB)",
    "parameters": {
        "type": "object",
        "properties": {
            "options": {
                "type": "array",
                "items": {
                    "type": "string"
                },
                "description": "The list of available options for the user to choose from."
            }
        },
        "required": ["options"]
    }
}
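
For reference, this is roughly how I send the request (a simplified sketch using the openai Python SDK against Azure OpenAI; the endpoint, key, API version, and deployment name below are placeholders, not my real values):

# Rough sketch of the request. Endpoint, key, api_version and the
# deployment name are placeholders.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<my-resource>.openai.azure.com/",
    api_key="<api-key>",
    api_version="2024-02-01",
)

# The function definition shown above.
prompt_user_function = {
    "name": "prompt_user_to_choose_an_option",
    "description": "Display the options for the user to choose from (OptionA or OptionB)",
    "parameters": {
        "type": "object",
        "properties": {
            "options": {
                "type": "array",
                "items": {"type": "string"},
                "description": "The list of available options for the user to choose from.",
            }
        },
        "required": ["options"],
    },
}

messages = [
    {"role": "system", "content": "You are an AI assistant for xyz company. ..."},
    # ...previous user/assistant turns are appended here...
]

response = client.chat.completions.create(
    model="<gpt-4-deployment-name>",   # Azure deployment name
    messages=messages,
    functions=[prompt_user_function],
    function_call="auto",              # the model decides when to call it
)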

But at step 3, instead of returning the function call, GPT goes ahead and asks the user with a plain-text prompt (“Do you want OptionA or OptionB?”) without calling the given function. If the user then responds with, say, “OptionB”, it returns the function call that was supposed to be returned in the previous step.

Why does this happen? Why does GPT prompt the user directly with plain text instead of returning the function call? I want this prompt to go through the function call because I need to display a specific UI element for the user at this step.
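
For clarity, this is a simplified version of how my client handles the reply; the UI helpers (show_option_picker, show_text_message) are just placeholder names for my own code:

# Simplified response handling: render the option-picker UI when the
# model returns a function call, otherwise just show the text content.
import json

message = response.choices[0].message

if message.function_call is not None:
    # What I expect at step 3: parse the arguments and show the option picker.
    args = json.loads(message.function_call.arguments)
    show_option_picker(args["options"])
else:
    # What actually happens: a plain-text question such as
    # "Do you want OptionA or OptionB?"
    show_text_message(message.content)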

I have tried adding instructions telling GPT to never use a text-based prompt when asking the user for options and to always use the function instead. That works sometimes, but not consistently. Is there a proper, consistent solution to this problem?

Hm, I think you have to sharpen your instructions. They read a bit ambiguously, and I interpreted them the same way GPT did in your example :)

Can you elaborate a little bit more on what you are trying to achieve? Perhaps then I or someone else can provide further input on enhancing the instructions.