Most of the time, function calling with o4-mini and the Responses API works great. Occasionally, I get the following error:
“Item ‘rs_684308adb3c4819bba96ef6883a4d9e7059dff5718d7b7a6’ of type ‘reasoning’ was provided without its required following item.”
After being very confused, I looked at the logs in the OpenAI dashboard. Everything looks correct at first glance, but in the network request details I see this:
"max_output_tokens": null,
"model": "o4-mini-2025-04-16",
"next_response_ids": [],
"output": [
{
"id": "rs_684308adb3c4819bba96ef6883a4d9e7059dff5718d7b7a6",
"type": "reasoning",
"summary": []
},
{
"id": "rs_684308b5cf04819bac3741bf179c1bce059dff5718d7b7a6",
"type": "reasoning",
"summary": []
},
{
"id": "fc_684308b60908819b9cd091c4603191f6059dff5718d7b7a6",
"type": "function_call",
"status": "completed",
// redacted
}
],
It really looks like the Responses API is returning two reasoning items in a row. I only see this in the logs for the requests that are breaking.
I don't think this is related to my client code at all. I don't even know how I would add or remove those output items to "fix" the problem. I believe it needs to be fixed on the OpenAI API server.
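In case anyone needs a stopgap while waiting on a server-side fix: since the error names a reasoning item "provided without its required following item", one possible client-side mitigation is to drop any reasoning item that is not immediately followed by a non-reasoning item before replaying the output back as input. This is only a sketch based on my reading of the error, not a confirmed workaround (`drop_orphan_reasoning` is a hypothetical helper name, and discarding reasoning items may have other side effects):

```python
def drop_orphan_reasoning(items: list[dict]) -> list[dict]:
    """Filter out reasoning items whose required following item is missing.

    A reasoning item is kept only if the next item in the list exists and
    is not itself a reasoning item (e.g. it is a function_call or message).
    """
    kept = []
    for i, item in enumerate(items):
        if item.get("type") == "reasoning":
            nxt = items[i + 1] if i + 1 < len(items) else None
            # Orphaned: no following item, or the next item is also reasoning
            if nxt is None or nxt.get("type") == "reasoning":
                continue
        kept.append(item)
    return kept


# Applied to the output shape from my logs (ids shortened):
output = [
    {"id": "rs_1", "type": "reasoning", "summary": []},
    {"id": "rs_2", "type": "reasoning", "summary": []},
    {"id": "fc_1", "type": "function_call", "status": "completed"},
]
cleaned = drop_orphan_reasoning(output)
# cleaned keeps only rs_2 (followed by fc_1) and fc_1 itself
```

You would run previous output through this before appending it to the `input` of the next `client.responses.create(...)` call.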