ChatGPT turbo streaming always ends in error

I’m trying to use the new chat completion endpoint with streaming set to true. It seems to work, but I’m always getting an error as the last chunk in the form of:

{
  error: 'The server had an error processing your request. Sorry about that! You can retry your request, or contact us through our help center at help.openai.com if you keep seeing this error. (Please include the request ID c96323aaed1362e77ec7c246cdb08ed0 in your email.)'
}

Is this a known issue?

Yes, in every case we have seen, the error comes from a mistake in how the client-side chat completion code was written, not from the OpenAI API server side.

Please post your code so we can help you debug it.

Thanks

:slight_smile:

Well, the chat completion does actually stream back, but the error only occurs in the last chunk. I’m doing a POST to:

https://api.openai.com/v1/chat/completions

with the params:

{
  "max_tokens": 200,
  "temperature": 0.75,
  "n": 4,
  "stream": true,
  "user": "testing",
  "messages": [
    {
      "role": "system",
      "content": "You are a masterful fiction writer that will continue the story in a compelling way"
    },
    {
      "role": "user",
      "content": "She powers on the microscopic chip embedded in the Wernicke’s area behind her ear lobes, that tiny sliver of brain responsible for speech processing. She subvocalizes her password. The chip captures the minute electrical signals sent to the larynx and emits her ID to the door. The door snicks open and the Voice utters a platitude she ignores."
    }
  ],
  "model": "gpt-3.5-turbo-0301"
}
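
For reference, here is a minimal sketch of sending a request like this and reading the stream with the Python requests library (illustrative only, not the actual client code used here, so the wiring details are assumptions):

import os

import requests

# Hypothetical reproduction of the request above; it only mirrors the posted params.
url = "https://api.openai.com/v1/chat/completions"
headers = {
    "Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}",
    "Content-Type": "application/json",
}
payload = {
    "max_tokens": 200,
    "temperature": 0.75,
    "n": 4,
    "stream": True,
    "user": "testing",
    "messages": [
        {"role": "system",
         "content": "You are a masterful fiction writer that will continue the story in a compelling way"},
        {"role": "user",
         "content": "She powers on the microscopic chip embedded in the Wernicke's area ..."},  # full story prompt from the params above, shortened here
    ],
    "model": "gpt-3.5-turbo-0301",
}

# stream=True keeps the connection open so the server-sent events arrive as they are generated.
with requests.post(url, headers=headers, json=payload, stream=True) as resp:
    resp.raise_for_status()
    for line in resp.iter_lines():
        if line:
            print(line.decode("utf-8"))

Each printed line should look like "data: {...}" with a JSON chunk, up to the terminating data: [DONE] line (or, in the failure described here, an error object).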

Well done. Those params look fine, @jamesyu.

Let’s bring in @raymonddavey to help you as he does a lot of streaming and has been helping members with their chat completion streaming issues.

:slight_smile:

The only thing different about the last block is that it doesn’t contain a chunk. It just has the string [DONE] after the data: tag.

The last chunk is always coming back as:

data: {"error":{"message":"The server had an error processing your request. Sorry about that! You can retry your request, or contact us through our help center at help.openai.com if you keep seeing this error. (Please include the request ID fb94b183e8b7bc5aca22fa62981cc107 in your email.)","type":"server_error","param":null,"code":null}}
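
For comparison, a rough sketch of how each server-sent event line can be handled, assuming resp is a streaming response like the one sketched above: a normal stream terminates with a literal [DONE] sentinel, while the failure described in this thread terminates with a JSON body carrying an "error" key instead of a "choices" chunk.

import json

def handle_stream(resp):
    # resp is assumed to be a requests.Response opened with stream=True.
    for raw in resp.iter_lines():
        if not raw:
            continue  # skip keep-alive blank lines between events
        line = raw.decode("utf-8")
        if not line.startswith("data: "):
            continue
        payload = line[len("data: "):]
        if payload == "[DONE]":
            print("\nstream finished normally")
            break
        event = json.loads(payload)
        if "error" in event:
            # This is the server_error chunk quoted above.
            print("\nstream ended with an error:", event["error"]["message"])
            break
        for choice in event.get("choices", []):
            # With n > 1, chunks from different completions arrive interleaved,
            # so a real client would group the deltas by choice["index"].
            delta = choice.get("delta", {})
            print(delta.get("content", ""), end="", flush=True)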


I’m receiving the following error sometimes in the middle of a stream:

stream error: stream ID 13; INTERNAL_ERROR; received from peer

Is it possible to recover from this error?
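
There is no documented way to resume a broken stream from the point it stopped, so the usual fallback is to catch the transport error and retry the whole request, as the error message itself suggests. A rough sketch (the exact exception types depend on your HTTP client; these assume the requests-based examples above):

import time

import requests

def stream_with_retries(send_request, max_attempts=3):
    # send_request is any callable that opens the stream and consumes it,
    # e.g. a wrapper around the handle_stream() sketch above.
    for attempt in range(1, max_attempts + 1):
        try:
            return send_request()
        except (requests.exceptions.ChunkedEncodingError,
                requests.exceptions.ConnectionError) as exc:
            print(f"attempt {attempt} failed mid-stream: {exc}")
            if attempt == max_attempts:
                raise
            time.sleep(2 * attempt)  # simple linear backoff before retrying

Any text already received before the break is lost unless the caller keeps it, so partial output has to be handled by the application.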
