Gpt-4 createChatCompletion stream response format and model switch

I’m using the Node SDK version 3.2.1 and calling createChatCompletion with gpt-4.

Here are my settings:

    {
      temperature: 0.5,
      max_tokens: 200,
      top_p: 1,
      frequency_penalty: 0,
      presence_penalty: 0,
      user: uid,
      model: "gpt-4",
      stream: true,
      messages: [
        {
          role: "system",
          content: systemMessage,
        },
        { role: "user", content: userMessage },
      ],
    }
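For reference, a minimal sketch of how the call itself can be wired up with the v3 (axios-based) SDK; the Configuration setup, the API-key env var, and the requestStream name are placeholders, not my exact code. As far as I can tell you also have to pass { responseType: "stream" } as a request option so that response.data comes back as a readable stream of events rather than a buffered body:

    const { Configuration, OpenAIApi } = require("openai");

    // How the API key is loaded is an assumption; adjust to your setup.
    const openai = new OpenAIApi(
      new Configuration({ apiKey: process.env.OPENAI_API_KEY })
    );

    // `settings` is the request object shown above.
    async function requestStream(settings) {
      // The second argument is forwarded to axios; responseType: "stream"
      // makes response.data a readable stream of server-sent events.
      return openai.createChatCompletion(settings, { responseType: "stream" });
    }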

The API responds but:

  1. Replaces gpt-4 with gpt-4-0314
  2. Delivers a different response format than is documented. Specifically, instead of choices[0].message.content it responds with choices[0].delta.content.

One chunk/message of the streamed response:

    {
      id: "chatcmpl-6wjCfqoSnIjjqUjmdvwKB7VH6Rmp5",
      object: "chat.completion.chunk",
      created: 1679454805,
      model: "gpt-4-0314",
      choices: [{ delta: { content: " for" }, index: 0, finish_reason: null }],
    }

I’m unclear whether this is a bug or the standard response format when stream: true. As per the docs, when stream is set to true the API responds with server-sent events, but I’m unclear about the differences in the choices array.
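If I understand correctly, each chunk only carries a fragment in choices[0].delta.content, and concatenating the fragments should reproduce what choices[0].message.content would contain without streaming. A rough sketch of that idea (the assembleMessage helper and the sample chunks are made up for illustration):

    // Rebuild the full assistant message from already-parsed stream chunks.
    function assembleMessage(chunks) {
      return chunks
        .map((chunk) => chunk.choices[0].delta.content ?? "")
        .join("");
    }

    // Two chunks shaped like the one above:
    const chunks = [
      { choices: [{ delta: { content: "Thanks" }, index: 0, finish_reason: null }] },
      { choices: [{ delta: { content: " for" }, index: 0, finish_reason: null }] },
    ];
    console.log(assembleMessage(chunks)); // "Thanks for"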

Appreciate any pointers or guidance on this. Thanks!

FWIW, I have been bashing my head against the wall for days now. If I find an answer I will be sure to let you know.

Hey, any update on this? I’ve tried to implement streaming and have not had any success. Either I get JSONDecode errors or the response content value comes through as empty. Two straight days of trying to debug this and I’m about to say screw it and just go back to no streaming.

  1. Replacing the model is normal; gpt-4 is an alias for the latest model, which is currently gpt-4-0314.
  2. That is the normal response format for streaming. See this issue with example code on parsing it; a minimal sketch follows below this list.
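For anyone landing here, a minimal sketch of that parsing approach with the v3 Node SDK, assuming the request was made with { responseType: "stream" } as shown earlier in the thread (the printStream name is made up):

    // Consume the axios response stream and print tokens as they arrive.
    function printStream(response) {
      response.data.on("data", (data) => {
        // Each event is a "data: {...}" line; the literal [DONE] ends the stream.
        const lines = data
          .toString()
          .split("\n")
          .filter((line) => line.trim() !== "");
        for (const line of lines) {
          const payload = line.replace(/^data: /, "");
          if (payload === "[DONE]") return;
          try {
            const token = JSON.parse(payload).choices[0].delta.content;
            if (token) process.stdout.write(token);
          } catch (err) {
            console.error("Could not parse stream chunk:", payload);
          }
        }
      });
    }

Note that a single "data" event can contain more than one line, and a JSON payload can occasionally be split across two events, which may be where the JSONDecode errors mentioned above come from.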

Does anyone have an example for Lua? I’m implementing gpt-4 in Roblox and I need the streaming feature.