RealtimeAPI randomly cuts off the connection

I have an app built on the Realtime API over WebSockets. It worked fine for a while, but recently I started facing a bug with random connection breaks from OpenAI. I tried to debug it, but the only thing I found is that the connection randomly cuts off — sometimes after a function call, sometimes out of the blue. Has anyone faced the same issue?


Hi, we are also using WebSockets. Could you please help us review our code? Thank you: community.openai(dot)com/t/real-time-api-with-websockets-working-example/1102267

Regarding your issue:
OpenAI recently experienced some API issues/outages. Could this be the cause?

Disconnects should be expected and handled gracefully with exponential-backoff retries. You should always assume an external API call or socket connection can fail or become disconnected, and handle that in your code.
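To make that concrete, here is a minimal sketch of exponential backoff with jitter in Python. The helper names (`backoff_delays`, `connect_with_retries`) are illustrative, not part of any OpenAI SDK; plug in whatever connect call your WebSocket client uses:

```python
import random
import time


def backoff_delays(base=1.0, cap=60.0, max_retries=8):
    """Yield reconnect delays: exponential growth capped at `cap`,
    with full jitter to avoid thundering-herd reconnects after an outage."""
    for attempt in range(max_retries):
        yield random.uniform(0, min(cap, base * 2 ** attempt))


def connect_with_retries(connect, delays):
    """Call `connect()` until it succeeds, sleeping between attempts.

    `connect` is your own function that opens the socket (hypothetical);
    it should raise OSError (or similar) on failure.
    """
    last_error = None
    for delay in delays:
        try:
            return connect()
        except OSError as exc:
            last_error = exc
            time.sleep(delay)
    raise RuntimeError("gave up reconnecting") from last_error
```

Full jitter (a random delay between zero and the exponential cap) is a common choice because it spreads reconnect attempts out, which matters when many clients drop at once during an outage.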

During periods of increased load, and during outages, error rates can rise and performance degrades. Any application using an external resource needs to account for this as a primary part of its code flow.

I have the same issue since yesterday.

We have been encountering the same WebSocket issue since last week. We update the session instructions based on a function call, and the connection terminates whenever a function call is triggered by the LLM. We decided to migrate to WebRTC, but it isn't functioning well either. The latest OpenAI releases introduced problems that haven't been resolved yet.


I have the same issue.

I have been experiencing the same issue since last week. Has it been completely resolved?

This was happening to me too, specifically after function calls. I was able to fix it by modifying the payload I send in the conversation.item.create event that passes the function response back to the model. Previously it required:

{
    "type": "conversation.item.create",
    "item": {
        "type": "function_call_output",
        "call_id": "the_call_id",
        "output": {
            "function_name": "the_function_name",
            "function_response": "some function string response"
        }
    }
}

But now it requires and works with:

{
    "type": "conversation.item.create",
    "item": {
        "type": "function_call_output",
        "call_id": "the_call_id",
        "output": "just the function string response"
    }
}

So I believe the changed format of the output parameter was causing the disconnects. At least that fixed it for me; hope it helps!
Edit: Docs link
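For reference, a minimal Python sketch of building the corrected event. The helper name `function_output_event` is hypothetical, and the commented `ws.send` line assumes whatever WebSocket client object your app already uses:

```python
import json


def function_output_event(call_id, output):
    """Build a conversation.item.create payload in the current format:
    `output` is a plain string, not a nested object with
    function_name/function_response keys."""
    return {
        "type": "conversation.item.create",
        "item": {
            "type": "function_call_output",
            "call_id": call_id,
            "output": output,  # must be a string in the new format
        },
    }


# With a connected WebSocket client `ws` (hypothetical), you would send:
# ws.send(json.dumps(function_output_event(call_id, result_string)))
```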