Was there an intentional change to the streaming responses? (multiple chunks in stream event)

I started receiving this error late yesterday: "Uncaught SyntaxError: Unexpected non-whitespace character after JSON at position".

It seems like multiple chunks are being sent in a single stream event without being delimited by newlines.

{"id":"chatcmpl-8le0pJ8DQMuLfBxDDRsPnM0TKmgjd","object":"chat.completion.chunk","created":1706365915,"model":"gpt-3.5-turbo-16k-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"role":"assistant","content":""},"logprobs":null,"finish_reason":null}]}{"id":"chatcmpl-8le0pJ8DQMuLfBxDDRsPnM0TKmgjd","object":"chat.completion.chunk","created":1706365915,"model":"gpt-3.5-turbo-16k-0613","system_fingerprint":null,"choices":[{"index":0,"delta":{"content":"Yes"},"logprobs":null,"finish_reason":null}]}

Zooming in to the issue:

..."finish_reason":null}]}{"id":"chatcmpl-8le0pJ8DQMu...
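The failure mode is easy to reproduce in isolation: JSON.parse rejects the input as soon as it hits the second object's opening brace. A minimal repro with hypothetical payloads:

```typescript
// Two complete JSON objects glued together with no delimiter,
// mimicking two chunks arriving in a single stream event.
const concatenated = '{"a":1}{"a":2}';

let failed = false;
try {
  JSON.parse(concatenated); // throws: unexpected non-whitespace character
} catch {
  failed = true;
}
console.log(failed); // true — a single parse call cannot handle glued objects
```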

I was able to patch the issue in the stream parser, but I’m curious whether this was intentional and if I missed the memo or what.

To anyone who needs a patch:

  1. I wrapped the JSON.parse in a try/catch block, and
  2. on error I check for "}{".
  3. If it's present, I insert a comma between each instance,
  4. wrap the entire thing in square brackets, and run JSON.parse again, and
  5. handle the output the same as before, but inside a forEach block.

I encountered the same problem. Fixed it in the same way.


I am experiencing this issue as well. It seems to happen with the GPT-4 Turbo preview models but not the GPT-3.5 Turbo models. I'm still trying to fix my web interface to account for this. My console shows multiple chunks being returned and concatenated, resulting in invalid JSON. I forked the interface I am using, and it has since been deprecated, so tracking down where to fix it has proved challenging. Any updates from the OpenAI team?

The weird thing, aside from there being no notes about any changes, is that it seems to happen sporadically and not 100% of the time.


I have been experiencing the same all day. This is the output I get from gpt-4-preview:

"Heading Application Head PR Samino:aging Expert and Communication

: journey from compelling content strategic in art auction, hon my to diverse audiences and execute projects This, with knack storytelling building,s with dynamic of of atl.dk I eager bring creative strategic to your of Dan save and time."


I was able to fix the issue using the notes provided by the author of this thread. It’s odd I have only seen the malformed chunks with GPT-4.

Can you please provide the actual code you used to fix this problem? I have been having the exact same problem since 1-Feb-2024. My code is failing on the const json = JSON.parse(e.data); line in the following code block:

        try {
          const json = JSON.parse(e.data);
        } catch (error) {
          console.log("2024-FEB-01-error:");
          console.log(e.data);
          return;
        }

Help is greatly appreciated!

It also seemed sporadic when I first noticed the issue.

Here is my code:

function parseResponseChunk (buffer: any): OpenAIResponseChunk[] {
  const chunk = buffer.toString().replace('data: ', '').trim();

  if (chunk === '[DONE]') {
    return [{ done: true }];
  }

  // Uncomment for debugging potential chunk issues
  // console.log(chunk);

  try {
    // Directly attempt to parse the chunk as a valid JSON object.
    const parsed = JSON.parse(chunk);
    return [{
      id: parsed.id,
      done: false,
      choices: parsed.choices,
      model: parsed.model
    }];
  } catch (e) {
    // If parsing fails, attempt to handle concatenated JSON objects.
    try {
      // Separate concatenated JSON objects and parse them as an array.
      const modifiedChunk = '[' + chunk.replace(/}\s*{/g, '},{') + ']';
      const parsedArray = JSON.parse(modifiedChunk);
      return parsedArray.map(parsed => ({
        id: parsed.id,
        done: false,
        choices: parsed.choices,
        model: parsed.model
      }));
    } catch (error) {
      console.error('Error parsing modified JSON:', error);
      // Return an indication of an error or an empty array as appropriate.
      return [{ done: true }];
    }
  }
}
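One caveat with the regex approach worth noting: it assumes the byte sequence "}{" never appears inside a string value. If a model's content delta happened to contain "}{", the rewrite would silently alter it. A contrived illustration (not something observed in this thread, and harmless in practice unless the model emits that exact sequence):

```typescript
// Hypothetical chunk whose content string itself contains "}{".
const chunk = '{"delta":{"content":"}{"}}';

// Sanity check: the original chunk is valid JSON on its own.
const original = JSON.parse(chunk);

// Apply the same rewrite used for concatenated chunks.
const rewritten = '[' + chunk.replace(/}\s*{/g, '},{') + ']';
const repaired = JSON.parse(rewritten)[0];

console.log(original.delta.content); // "}{"
console.log(repaired.delta.content); // "},{"  — silently altered
```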

Thank you so much. Very much appreciated. It did do the trick for GPT-4 chunks. How can OpenAI just change the API data format? Not very professional!
