OpenAI API Error "function_call was provided without its required reasoning item" - The Real Issue

Hey everyone, wanted to share a tricky issue we just solved that might help
others.

The Problem:
After upgrading from o3 to gpt-5, our multi-hop tool calling
started failing with this error:
Error code: 400 - {'error': {'message': "Item 'fc_68b2d…' of type
'function_call' was provided without its required 'reasoning' item:
'rs_68b2d…'."}}

What Made It Confusing:

  • o3 rarely made parallel tool calls, so we never hit this issue
  • gpt-5 is more agentic and often makes 5-10 parallel calls in one hop
  • The error message suggests you’re missing reasoning items, but that wasn’t
    the actual problem

What We Were Doing (Wrong):
In hop 1, OpenAI sends: reasoning, call1, call2, call3

We were building our chain for hop 2 like:
reasoning
call1
output1
call2
output2
call3
output3

The Fix:
OpenAI expects calls and outputs to be grouped separately:
reasoning
call1
call2
call3
output1
output2
output3

Basically, append all function_calls to the chain as you receive them during
streaming, then append all outputs after execution - don’t interleave them.
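To make the grouping concrete, here is a minimal sketch of how the next-hop input might be assembled. This is an assumption about your setup, not the poster's actual code: items are simplified dicts, and `build_next_hop_input` is a hypothetical helper name. The real Responses API SDK objects carry more fields, but the ordering rule is the same.

```python
def build_next_hop_input(streamed_items, outputs_by_call_id):
    """Build the input list for the next hop.

    First keep the reasoning and function_call items exactly as received,
    then append all function_call_output items afterwards - never
    interleave a call with its output.
    """
    chain = []

    # Pass 1: reasoning + function_call items, in the order they streamed in.
    for item in streamed_items:
        chain.append(item)

    # Pass 2: one output per call, appended after ALL calls.
    for item in streamed_items:
        if item["type"] == "function_call":
            chain.append({
                "type": "function_call_output",
                "call_id": item["call_id"],
                "output": outputs_by_call_id[item["call_id"]],
            })

    return chain
```

With three parallel calls this yields reasoning, call1, call2, call3, output1, output2, output3 - the grouped shape described above.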

Hope this saves someone the hours of debugging we went through. The error
message really should say something about chain structure rather than “missing
reasoning item” :man_shrugging:


If you ever make it to Valtellina (northern Italy), count on a traditional dinner on me. Thanks so much for the tip!


Hi Claudio! When I solved this I thought, "someone else is going to run into this, I need to make a post" - so I posted it, even though my friends told me it was a waste of time. I'm very glad it was helpful to you. I'm actually planning to be in Italy sometime in the next 1-2 years, so please ping me at Michaelk.juiceapp@gmail.com. I'd be glad to meet and discuss AI stuff (and have an Italian dinner, of course) :slight_smile:


I sent you an email! Thanks again.

PS: I think your post helped a lot of people; they just didn't comment.


Hi Claudio! I got your email and responded yesterday - any chance it went into your spam folder? Thank you for the kind words.

@SpecialFx tagging you here so you don't miss my message above (just in case)

Thanks for sharing that first — I’ve got a couple of questions.
When you say we need to input them in that order when sending to the model, do you mean when passing them through the item_reference parameter?

And my second question is:
What should we do if there are multiple reasoning items?

Thanks! For context, I run a stateless setup — I don’t store history on OpenAI. I build the next hop myself and send the full items (not relying on item_reference). If you do use item_reference, the same ordering rule applies.

On multiple reasoning items: keep them in the exact order you received them. In my case it’s usually just one.
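A small sketch of what "keep them in the exact order you received them" might look like with two reasoning items in one hop. The item shapes and IDs here are illustrative assumptions, not output from a real API call; the point is only that the received order of reasoning and call items is preserved, with outputs grouped at the end.

```python
# Items as they arrived during streaming (simplified, hypothetical shapes).
received = [
    {"type": "reasoning", "id": "rs_1"},
    {"type": "function_call", "call_id": "c1", "name": "search"},
    {"type": "reasoning", "id": "rs_2"},
    {"type": "function_call", "call_id": "c2", "name": "fetch"},
]

# Next-hop input: received items untouched and in order,
# then all outputs appended afterwards.
next_hop = list(received) + [
    {"type": "function_call_output", "call_id": "c1", "output": "..."},
    {"type": "function_call_output", "call_id": "c2", "output": "..."},
]
```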