COMPLAINT: gpt-3.5-turbo Got Way Less Intelligent Than Before


It does not understand the prompt at all.
No matter how many times I try, these are the results:

Hello! How can I assist you today?
Hi! How can I assist you?

Moreover, I am not able to find the “gpt-3.5-turbo-0613” model.


Here’s another try; it failed dramatically.


I implemented your scenario using the new function-calling capability of the Chat API.

const messages = [
  { role: "user", content: "I want Jack Bauer as soon as possible. Send the object to Los Angeles Stadium at Hollywood Park." }
]

const response = await openai.createChatCompletion({
  model: "gpt-3.5-turbo-0613",
  messages,
  functions: [
    {
      name: "find",
      description: "Find a person's location using the person's name.",
      parameters: {
        type: "object",
        properties: {
          name: {
            type: "string",
            description: "The name of the person, e.g. John, Jane"
          }
        },
        required: ["name"]
      }
    },
    {
      name: "deliver",
      description: "Deliver an object to a location.",
      parameters: {
        type: "object",
        properties: {
          location: {
            type: "string",
            description: "The location to deliver the object, e.g. Los Angeles, CA"
          }
        },
        required: ["location"]
      }
    }
  ]
})

The response I get only shows a call to the deliver function:

{
  role: 'assistant',
  content: null,
  function_call: {
    name: 'deliver',
    arguments: '{\n  "location": "Los Angeles Stadium at Hollywood Park"\n}'
  }
}
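Note that the model calls at most one function per response, so to get both find and deliver you have to run a loop: execute the returned call, append the result as a "function" message, and request another completion. Below is a minimal sketch of that dispatch step; the local find/deliver implementations are hypothetical stand-ins, not anything from the API itself.

```javascript
// Hypothetical local implementations of the two declared functions.
const available = {
  find: ({ name }) => JSON.stringify({ name, location: "CTU, Los Angeles" }),
  deliver: ({ location }) => JSON.stringify({ status: "dispatched", location }),
};

// Given the assistant message from one completion, execute the requested
// function and return the "function" message to append to `messages`
// before calling the API again. Returns null when the model answered
// with plain text instead of a function call.
function handleFunctionCall(message) {
  if (!message.function_call) return null;
  const { name, arguments: rawArgs } = message.function_call;
  const args = JSON.parse(rawArgs); // arguments arrive as a JSON string
  const result = available[name](args);
  return { role: "function", name, content: result };
}
```

In the full loop you would push this returned message onto `messages`, call `createChatCompletion` again, and repeat until the model replies with `content` instead of a `function_call`.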

If I change the message by removing the last sentence:

const messages = [
  { role: "user", content: "I want Jack Bauer as soon as possible." }
]

The response is:

{
  role: 'assistant',
  content: null,
  function_call: { name: 'find', arguments: '{\n"name": "Jack Bauer"\n}' }
}

Prompting the model to “call functions using the following JSON format” worked very well in the past.
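For comparison, that “old way” was to describe the functions in the prompt and ask the model to reply in a fixed JSON format, then parse the text yourself. Here is a minimal sketch; the exact format shown is an assumption, since the original prompt isn't included in this thread.

```javascript
// Assumed prompt fragment describing the expected reply shape.
const systemPrompt =
  "You can call functions using the following JSON format:\n" +
  '{ "function": "<find|deliver>", "arguments": { ... } }\n' +
  "Reply with JSON only.";

// Parse the model's plain-text reply. Returns null when the reply is not
// valid JSON or does not name one of the known functions.
function parsePromptedCall(replyText, knownFunctions) {
  try {
    const parsed = JSON.parse(replyText);
    if (knownFunctions.includes(parsed.function)) return parsed;
  } catch (e) {
    // The model ignored the format and answered in prose.
  }
  return null;
}
```

The fragility of this approach (the model can ignore the format or emit broken JSON) is exactly what the built-in `functions` parameter is meant to remove.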

Thanks for your short write-up. What do you think about this new feature?
I actually prefer the old way.

It looks neater, and the format is simple to write compared to crafting your own prompts.

Yeah, you have a point there!
Maybe they tuned the model so the new way behaves better than the old way, and the old way now performs badly.