Streaming gpt-5 responses API - missing delta chunks

Hello, I have a problem with streaming GPT-5. When streaming, I don't get back any delta chunks. I only get response.created, then response.in_progress, then response.output_item.added, then a long pause, and finally response.output_item.done…

So I don't receive any delta chunks, and the final done chunk has no text content.

Does anyone else have the same problem?

You may have a problem if you are under-specifying max_completion_tokens (max_output_tokens on the Responses API).

That parameter sets your maximum spending budget.

This model does internal reasoning that counts as generation, and on the Responses API it can also iterate internally on tools and keep generating new reasoning. If you set the maximum tokens too low, the model may never transition to producing your response; instead it gets cut off while still reasoning.
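To make the budget effect concrete, here is a minimal sketch. It is not the actual accounting the API does, just a hypothetical simplification of the idea that reasoning tokens are spent from the same max_output_tokens budget before any visible answer text can stream:

```python
def visible_tokens(max_output_tokens: int, reasoning_tokens: int, answer_tokens: int) -> int:
    """Hypothetical sketch: reasoning tokens are spent first from the
    budget; visible answer text only streams with whatever remains."""
    remaining = max_output_tokens - reasoning_tokens
    return max(0, min(answer_tokens, remaining))

# If reasoning alone exhausts the budget, no text deltas ever arrive:
print(visible_tokens(256, 300, 120))   # -> 0
# A generous budget leaves room for the full answer:
print(visible_tokens(4096, 300, 120))  # -> 120
```

So if you see the stream stall with no deltas, try raising max_output_tokens well above what you'd expect the visible answer to need.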

I found out where the problem is. Maybe this will help somebody…

GPT-4o and older models without reasoning send a response.output_item.done chunk at the end, so it (kind of) marks the end of the stream… but GPT-5 sends a response.output_item.done when its reasoning item finishes, then sends the deltas, and then another response.output_item.done.
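In other words, don't treat the first response.output_item.done as the end of the stream; keep reading until response.completed. A minimal sketch, using plain dicts in place of the SDK's event objects (the event type names here are assumed from the Responses API streaming events):

```python
def collect_text(events) -> str:
    """Accumulate streamed answer text, treating response.completed
    (not the first response.output_item.done) as end of stream."""
    parts = []
    for ev in events:
        if ev["type"] == "response.output_text.delta":
            parts.append(ev["delta"])
        elif ev["type"] == "response.completed":
            break  # the real end of the stream
    return "".join(parts)

# GPT-5-style sequence: the reasoning item finishes first,
# then the message item streams its deltas.
events = [
    {"type": "response.created"},
    {"type": "response.output_item.added"},  # reasoning item
    {"type": "response.output_item.done"},   # reasoning done -- NOT the end!
    {"type": "response.output_item.added"},  # message item
    {"type": "response.output_text.delta", "delta": "Hel"},
    {"type": "response.output_text.delta", "delta": "lo"},
    {"type": "response.output_item.done"},   # message done
    {"type": "response.completed"},
]
print(collect_text(events))  # -> Hello
```

If your client closed the connection on the first output_item.done, that would explain the "long pause and then nothing" you saw: the reasoning item had finished, but the text deltas were still on their way.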