I am using a variety of OpenAI models in my project and rely heavily on function calling to get structured, constrained responses.
I have noticed that when a function declares an enum for one of its parameters and "gpt-4-0125-preview" is given a short input (i.e. a string of fewer than 5 characters), it just echoes the input back instead of returning a value from the enum. GPT-3.5 Turbo and the older GPT-4 Turbo do not exhibit this behavior.
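For context, here is a minimal sketch of the kind of setup I mean, plus the validation guard I am using as a workaround. The function name, enum values, and the `validate_category` helper are all illustrative, not my actual code:

```python
import json

# Hypothetical tool schema: a function whose "category" parameter is
# constrained to an enum of allowed values.
tool = {
    "type": "function",
    "function": {
        "name": "classify_input",
        "description": "Classify the user's input into one of the allowed categories.",
        "parameters": {
            "type": "object",
            "properties": {
                "category": {
                    "type": "string",
                    "enum": ["billing", "support", "sales", "other"],
                    "description": "One of the enum values, never the raw input.",
                },
            },
            "required": ["category"],
        },
    },
}

def validate_category(arguments_json: str, fallback: str = "other") -> str:
    """Check the model's tool-call arguments against the enum; fall back
    when the model echoes the raw input instead of an enum value."""
    args = json.loads(arguments_json)
    allowed = tool["function"]["parameters"]["properties"]["category"]["enum"]
    value = args.get("category")
    return value if value in allowed else fallback

print(validate_category('{"category": "sales"}'))  # → sales
print(validate_category('{"category": "abc"}'))    # → other (model echoed the short input)
```

With a guard like this the echoed short inputs get caught, but I would still like to understand why this model version behaves differently.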
Has anyone else experienced this?