New API changes on Chat Completion broke some previous working stuff

Dear OpenAI Community,

I’m writing to share and seek advice on a specific issue we’ve encountered with the recent changes to the Chat Completion API, which have significantly affected our application’s functionality.


Our application relies heavily on the Chat Completion API, particularly a flow where certain function calls would initially return a status message like ‘in progress’. This was integral to our user experience, as it informed users that their request was being processed. The function would later update with the final result, allowing for a seamless and interactive user experience.


The recent API updates have introduced a specific requirement that a tool call response must directly follow the message where it was initiated. While this is manageable, it disrupts our previous method of providing an early ‘in progress’ message, followed by the actual tool response a few messages later. This approach, which involved issuing a tool message with the final result after non-tool call messages, is no longer viable under the current API constraints.

    Invalid parameter: messages with role 'tool' must be a response to a preceding message with 'tool_calls'.
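For anyone hitting the same error: as far as I can tell, the constraint is that every message with role `tool` must come directly after the assistant message that carries the matching `tool_calls` entry. A minimal sketch of an ordering the API appears to accept (the function name `get_status` and the call id are illustrative only, not from any official docs):

```javascript
// Sketch of a message sequence that satisfies the new constraint.
// "get_status" and "call_1" are illustrative placeholders.
const messages = [
  { role: "user", content: "Check my order" },
  {
    role: "assistant",
    content: null,
    tool_calls: [
      {
        id: "call_1",
        type: "function",
        function: { name: "get_status", arguments: "{}" },
      },
    ],
  },
  // The tool response must follow the assistant message that issued
  // the call -- inserting any other message in between triggers the
  // "Invalid parameter" error quoted above.
  { role: "tool", tool_call_id: "call_1", content: "in progress" },
];
```

Our old pattern put an intermediate assistant status message between the tool call and the tool response, which is exactly what the API now rejects.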

This change means we cannot notify users that their request is in progress until after we have made a tool call, which undermines the interactive nature of our application.


The inability to provide immediate status updates has led to a less responsive user experience. Our users were accustomed to receiving instant feedback, and this change has made our application feel slower and less interactive.

Request for Advice:

We are seeking advice or suggestions on how to adapt to these changes. Specifically:

  1. Is there a recommended approach to provide immediate feedback under the new system?
  2. Are there alternative methods to implement a similar functionality that complies with the new API rules?
  3. Any insights into whether this change is permanent would also be greatly appreciated.

We hope to find a solution that allows us to maintain the level of interactivity and responsiveness our users have come to expect. Any guidance from the OpenAI team or community members who have faced similar challenges would be invaluable.

Thank you for your attention and assistance.


Hey @aurimasniekis! Have you found a solution to this issue yet?


I resolved it by just adding

       tool_calls: toolCalls,
       role: "assistant",

before the message with role="tool"
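To spell that out: before appending the tool result, the assistant turn that produced the tool calls has to be echoed back into the message history. A rough sketch of the fix, where `toolCalls` and `responseBack` are placeholders standing in for the values from the snippets above:

```javascript
// Sketch of the fix: push the assistant message (with its tool_calls)
// into the history first, then the tool result that answers it.
// "toolCalls" and "responseBack" are illustrative placeholders.
const toolCalls = [
  {
    id: "call_1",
    type: "function",
    function: { name: "lookup", arguments: "{}" },
  },
];
const responseBack = "done";

const messages = [];
messages.push({ role: "assistant", content: null, tool_calls: toolCalls });
messages.push({
  role: "tool",
  tool_call_id: toolCalls[0].id, // must match the id in tool_calls
  content: responseBack,
});
```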


@e.cubillas23 this solves it, but I’m baffled by the flow:

First I’m pushing the tool calls like you suggested (multiple tool calls),
but afterward I need to push a single tool’s response?

            role: 'tool',
            content: responseBack,

Either I need to add {assistant: tool call} and then {tool: response} one at a time for each tool call, OR I need to be able to add all tool calls and then all responses.
What am I missing here?
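For what it’s worth, my understanding (an assumption based on the error message and experimentation, not official docs) is the second option: push one assistant message containing all the `tool_calls`, then one role `tool` message per call, each linked back via `tool_call_id`. Sketch with illustrative names:

```javascript
// Assumed pattern for multiple tool calls: one assistant message
// carrying every call, followed by one tool message per call, each
// matched by tool_call_id. Ids, names, and results are illustrative.
const toolCalls = [
  { id: "call_a", type: "function", function: { name: "f1", arguments: "{}" } },
  { id: "call_b", type: "function", function: { name: "f2", arguments: "{}" } },
];

const messages = [
  { role: "assistant", content: null, tool_calls: toolCalls },
];

// Append one tool response per call, each referencing its call id.
for (const call of toolCalls) {
  messages.push({
    role: "tool",
    tool_call_id: call.id,
    content: "result for " + call.id,
  });
}
```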