I’ve been running into this error a lot using gpt-realtime through the pipecat package; is there a known cause for it?
A fatal error occurred: ErrorFrame#11(error: The server had an error while processing your request. Sorry about that! Please contact us through our help center at `help.openai.com` if the error persists. (include session ID in your message: sess_CpXiXDvntsA7PRZOVS55H). We recommend you retry your request., fatal: True)
This happens right after we get a `conversation.item.truncate` (one that truncates the item down to 0).
Right after that I see `AssertionError: New text length 0 is less than last length 726`, so I think there’s some interaction where truncation is breaking things.
Is it easy to disable sending those truncate events in pipecat, to see if that makes the bug go away?
I’ve also suddenly started getting the same error when this was working previously. Here’s the log:
2026-01-19 15:22:31,805 ERROR OpenAI Response Failed: {'type': 'failed', 'error': {'type': 'server_error', 'code': None, 'message': 'The server had an error while processing your request. Sorry about that! Please contact us through our help center at help.openai.com if the error persists. (include session ID in your message: sess_Czq3XSCTk82H4fjerbryP). We recommend you retry your request.'}}
Limiting the number of `allowed_tools` entries seems to have resolved that bug, but now I’m getting an `mcp_list_tools.failed` event with no further logs or details.
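For anyone hitting the same thing, the workaround amounts to trimming the `allowed_tools` list in the remote MCP tool config. A rough sketch, assuming the Responses-style MCP tool shape — the server label, URL, and the 16-tool cap are placeholders, not documented limits:

```python
# Sketch: cap how many allowed_tools are exposed to the model.
# The tool-config shape follows OpenAI's remote MCP tool docs; the
# server_label, server_url, and max_tools value here are placeholders.

def mcp_tool_config(tool_names: list[str], max_tools: int = 16) -> dict:
    """Build an MCP tool config with the allowed_tools list trimmed."""
    return {
        "type": "mcp",
        "server_label": "my_server",              # placeholder label
        "server_url": "https://example.com/mcp",  # placeholder URL
        "allowed_tools": tool_names[:max_tools],  # trim the list
    }

cfg = mcp_tool_config([f"tool_{i}" for i in range(40)])
print(len(cfg["allowed_tools"]))
```

In my case, passing the full tool list produced the `server_error` above, and the trimmed list produces `mcp_list_tools.failed` instead, which at least narrows it down to the tool-listing step.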