New Response Pattern: Text + Function Calls

Has anyone else noticed the change in how the API handles function calls now?

Previously:

User: "What's the weather?" 
Assistant: {tool_call}

Now:

User: "What's the weather?" 
Assistant: "Let me check the weather for you." + {tool_call}

This seems like a significant UX change - the model now provides natural language responses alongside function calls rather than just returning the function call alone. Is this an intentional design change? Curious about the reasoning behind it and if there are best practices for handling this new pattern in applications.
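Right now I'm handling it with something like the sketch below (Python, OpenAI Chat Completions SDK; `get_weather` and its schema are just placeholders), checking both fields on the same message:

```python
from openai import OpenAI

client = OpenAI()

# Hypothetical tool schema, for illustration only
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a location.",
        "parameters": {
            "type": "object",
            "properties": {"location": {"type": "string"}},
            "required": ["location"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-4-turbo",
    messages=[{"role": "user", "content": "What's the weather?"}],
    tools=tools,
)

message = response.choices[0].message

# The same assistant message can now carry both fields, so check each one
if message.content:
    print(message.content)  # e.g. "Let me check the weather for you."
if message.tool_calls:
    for call in message.tool_calls:
        print(call.function.name, call.function.arguments)
```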

This is not a new ability. You can find me demonstrating it in recent posts, for anyone trying to force one type of output or the other.

The amount of instruction needed to evoke the combined "content" + "tool_calls" behavior depends on the model. The example depicted used gpt-4-turbo specifically to show this.
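As a rough illustration of what that instruction can look like (the wording below is illustrative only, not a guaranteed recipe for every model):

```python
# Illustrative system message; how strongly it must be worded varies by model.
messages = [
    {
        "role": "system",
        "content": (
            "When you decide to call a tool, first write one short sentence "
            "telling the user what you are about to do, then emit the tool call."
        ),
    },
    {"role": "user", "content": "What's the weather?"},
]
```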


Hey,

Thanks for the explanation. Is there any way to make the AI always execute the tool call without generating any text first?

Thanks.

The only guaranteed way is to force a specific function to always be called via the tool_choice API parameter, which means the AI no longer makes any decision about when the function would be useful.
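A sketch of that forcing, reusing the hypothetical `get_weather` tool from above:

```python
from openai import OpenAI

client = OpenAI()

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a location.",
        "parameters": {
            "type": "object",
            "properties": {"location": {"type": "string"}},
            "required": ["location"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-4-turbo",
    messages=[{"role": "user", "content": "What's the weather?"}],
    tools=tools,
    # Force this specific function on every request; the model no longer
    # decides whether calling the tool is actually useful.
    tool_choice={"type": "function", "function": {"name": "get_weather"}},
)
```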

Otherwise, you can provide a function description that instructs the opposite behavior and see if the AI model you are using will follow along, something like: “this function is always called immediately and silently if useful, without informing the user of your intention to use it.”
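A sketch of that description trick, again with the hypothetical `get_weather` tool; whether a given model actually honors it is not guaranteed:

```python
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        # Description worded to discourage any announcement before the call
        "description": (
            "Get the current weather for a location. This function is always "
            "called immediately and silently if useful, without informing the "
            "user of your intention to use it."
        ),
        "parameters": {
            "type": "object",
            "properties": {"location": {"type": "string"}},
            "required": ["location"],
        },
    },
}]
```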