GPT API BUG: with function calling in chat mode, the first action always sends the function call in the text response

So if you use the chat API with gpt-4-0613 or gpt-3.5-turbo-0613, define a function to be called for whatever reason, and on the very first interaction with the chat you trigger the situation that should call the function, GPT won't execute the function; instead, 100% of the time it sends the function call in the response text.

Example.

Prompt + function description:
If the person asks about the weather, call the function weather 123412 with parameter xyz.

Function:
weather
description: "a function that is called when you want to know about the weather …

Start of the chat:

Input:
What is the weather?

Output: I'm going to get the weather information for you
{ weather 123141 xyz }

That whole output comes back as the answer text; it never executes the function when it is needed in the first action. Does anyone know why?
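
A minimal sketch of how the symptom can be checked on the client side, assuming the pre-1.0 openai Python SDK (ChatCompletion interface); the function definition and messages here are placeholders, not my real ones:

import openai

openai.api_key = "sk-..."

functions = [{
    "name": "weather",
    "description": "A function that is called when you want to know about the weather",
    "parameters": {"type": "object", "properties": {}},
}]

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=[{"role": "user", "content": "What is the weather?"}],
    functions=functions,
    function_call="auto",
)

message = response["choices"][0]["message"]
if message.get("function_call"):
    # Expected path: a structured call in the dedicated function_call field
    print("function_call:", message["function_call"]["name"])
elif message.get("content"):
    # The symptom described above: the "call" arrives as plain text content
    print("text response:", message["content"])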

Works dandy. No such symptom, except that gpt-3.5-turbo is so absolutely stupid and broken that it can’t follow the description of a single function parameter and still asks for a city name.

import openai
import json
openai.api_key = "sk-aaoaijf"

function_list = [
    {
        "name": "get_weather",
        "description": "Retrieve weather conditions and forecast",
        "parameters": {
            "type": "object",
            "properties": {
                "city_or_local": {
                    "type": "string",
                    "description": "A US major city. You can just write local if unspecified"
                },
            },
            "required": ["city_or_local"],
        },
    }
]
message_list = [{"role": "system", "content": "You are Answerbot, an AI assistant."},
                {"role": "user", "content": "Weather forecast?"}]
response = openai.ChatCompletion.create(
    model="gpt-4", max_tokens=100, top_p=0.5,
    messages=message_list, 
    functions=function_list, function_call="auto")
print('--------------------')
print(response)
print('--------------------')
message = response["choices"][0]["message"]
if message.get("content"):
    print(f"AI: {message['content']}")
if message.get("function_call"):
    function_name = message["function_call"]["name"]
    try:
        function_args = json.loads(message["function_call"]["arguments"])
        print(f"Called: function.{function_name}({function_args})")
    except json.JSONDecodeError:
        print("bad function")

Output, GPT-4:

Called: function.get_weather({'city_or_local': 'local'})

Output, GPT-3.5-turbo:

AI: Sure, could you please provide me with the name of a city or should I use your current location?

Mine is in Portuguese, but that doesn't matter. The point is that everything is in the right place, and 1 in 30~50 times that it tries to call the function it writes the function in the text instead.

It's good enough, but it still has this little bug, or it needs to be fine-tuned.

[
    {
        "name": "transfer_att",
        "description": "A function that is requested by the name transfer_att and Transfer to an attendant id without writing to the customer any message",
        "parameters": {
            "type": "object",
            "properties": {
                "id": {
                    "type": "integer",
                    "description": "Attendant's ID which is a number",
                    "enum": [1, 2, 3, 4, 5, 6]
                }
            },
            "required": ["id"]
        }
    },

The function is above, and the command that I use in the prompt to make it act is:
Request the function named transfer_att id 2

“integer” is not a json data type. I don’t know why the function call API didn’t fail your schema on that. “number” is correct for a number. The AI only receives an enum array if the data type is string, though.
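
For example, the parameter could be declared like this (just a sketch of that suggestion, written in the same style as the Python function list earlier; the rewritten description wording is mine and untested):

function_list = [
    {
        "name": "transfer_att",
        # Description reworded here purely as an illustration
        "description": "Transfer the conversation to a human attendant, without writing any message to the customer.",
        "parameters": {
            "type": "object",
            "properties": {
                "id": {
                    # string type so the enum of allowed values is passed to the model
                    "type": "string",
                    "description": "Attendant ID, one of the listed values",
                    "enum": ["1", "2", "3", "4", "5", "6"],
                },
            },
            "required": ["id"],
        },
    },
]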

The description of the function is poor enough that I wouldn’t be able to suss out the purpose of this function to improve it. Although English can be the best way to instruct GPT-X even with the necessary code switching, I would write it in your native language and see if the objective is better met.

If you don’t need creativity in the language produced, you can reduce top_p as low as 0.001 so that nothing but the most likely tokens will be generated as AI json output.

You can also add more prompt: “Your API tool namespace functions only accept valid JSON”.
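
Combining both suggestions in the earlier call might look like this (a sketch only; the model and the user message are placeholders, and function_list is the list sketched above):

message_list = [
    {"role": "system",
     "content": "You are Answerbot, an AI assistant. "
                "Your API tool namespace functions only accept valid JSON."},
    {"role": "user", "content": "I need to talk to a human attendant."},
]

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=message_list,
    functions=function_list,
    function_call="auto",
    top_p=0.001,  # near-deterministic sampling for the JSON arguments
)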

The purpose is to work as a customer service bot.

So there are steps of execution; when it is time to transfer the conversation to a human, I write the prompt as above: request the function named…

It works 95%+ of the time. I'll try to change the top_p.

If you have any idea how to make the prompt better I would love it, or the description, but this one has given the best performance so far.