Streaming response from `/v1/chat/completions` endpoint is missing the first token

This issue is resolved. It came down to the decoded string value I was getting back: the first chunk received does carry the metadata, but it also includes the first token. Before running `decoded.split()`, the first chunk looks like this:

```
data: {"id":"chatcmpl-7CUO9LG02qY3GbqIwH6buRz69jUrZ","object":"chat.completion.chunk","created":1683211105,"model":"gpt-3.5-turbo-0301","choices":[{"delta":{"role":"assistant"},"index":0,"finish_reason":null}]}

data: {"id":"chatcmpl-7CUO9LG02qY3GbqIwH6buRz69jUrZ","object":"chat.completion.chunk","created":1683211105,"model":"gpt-3.5-turbo-0301","choices":[{"delta":{"content":"Dear"},"index":0,"finish_reason":null}]}
```

That’s all one chunk, so instead of splitting on the `data:` string, I pivoted to regex-matching the `"content"` value, like this:

```js
// Capture the first streamed token from the raw chunk with a named group
let { groups: { newToken } } = decoded.match(/data:\s*{.*?"content":"(?<newToken>.*?)".*?}/s)
```
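For the chunk shown above, this captures `newToken` as `Dear`, since the lazy match skips past the role-only delta and stops at the first `"content"` field.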

It’s still a little janky, but it does solve my issue.
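For anyone hitting the same thing, here is a minimal sketch of a more structural alternative: split the chunk on newlines, take each `data:` line, and `JSON.parse` it. The function name `extractTokens`, and the assumption that `decoded` holds one chunk made up of complete lines, are mine:

```js
// Minimal sketch: parse every "data:" line in a chunk instead of regex-matching one token.
// Assumes `decoded` is the UTF-8-decoded text of a single streamed chunk.
function extractTokens(decoded) {
  const tokens = [];
  for (const line of decoded.split("\n")) {
    const trimmed = line.trim();
    if (!trimmed.startsWith("data:")) continue;       // skip blank lines between events
    const payload = trimmed.slice("data:".length).trim();
    if (payload === "[DONE]") continue;               // end-of-stream sentinel
    const parsed = JSON.parse(payload);
    const content = parsed.choices?.[0]?.delta?.content;
    if (content !== undefined) tokens.push(content);  // role-only deltas carry no content
  }
  return tokens;
}
```

One caveat: a network read can end mid-line, so a sturdier parser would buffer partial lines across chunks; the sketch above assumes each chunk contains whole `data:` lines.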
