Repeating response content when using streamed chat API

UPDATE: the issue was with the nginx server config. Both the OpenAI API and the PHP client library (openai-php/client) are working fine.
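For anyone hitting the same thing: with streamed (SSE) responses behind nginx, response buffering is the usual culprit, and a typical adjustment looks roughly like this (the location path and upstream address are only placeholders):

    location /chat-stream {
        proxy_pass http://127.0.0.1:8080;  # your app backend
        proxy_buffering off;               # don't buffer the event stream
        proxy_cache off;
        proxy_read_timeout 300s;           # keep long-running streams open
    }

Alternatively, the upstream application can send an X-Accel-Buffering: no response header to disable proxy buffering for that response only.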

I just tested my streaming chat system with the gpt-3.5 endpoint and it seems to be functioning correctly, so I can only think it’s an issue with the calling code?

And what might that issue with the “calling code” be?
As you can see, the code is extremely simple.

I am not aware of any PHP library made by OpenAI, so I am assuming that library is from a third party and may contain errors.

Where did you get it from?

It’s on GitHub: openai-php/client
@contractorwolf was using the Node.js client. The issue might not be due to the client library.

I understand, but I am using the OpenAI Python library and I am streaming chat without issues, which means that either:
a) the server instance I am connected to does not have the issue that the server instance you are connected to does,
or
b) the library you are using has an issue.

Is that a fair assessment?

It’s not an issue with the API or the client library.
The issue is with the nginx server.

So you narrowed it down? Can you share your solution, if you have one?

I’m getting the same thing with 4.0.1 in TypeScript/JavaScript. I’m trying to use the gpt-3.5-turbo-16k model with a fairly large payload of messages to do a code refactor (I’m waiting for my GPT-4 access to come through after payment; it won’t let me do this on 4 at the moment).

When I use stream: false, I get a timeout

When I use stream: true, I get a few tokens correctly, then the same token repeated

Code is simply:

    // messages, opts, and outFolder are defined earlier in the surrounding code;
    // exists() and writeToFile() are local helpers defined elsewhere
    let content = ""
    let reason = ""

    const response = await openai.chat.completions.create({
      messages,
      ...opts,
      stream: true,
    })
    for await (const part of response) {
      // n.b. every chunk in a single stream carries the same completion id,
      // so this existence check skips every chunk after the first one
      const chunkFile = `${outFolder}/chunks/${part.id}.json`
      if (exists(chunkFile)) {
        console.log(`Skipping ${chunkFile}`)
        continue
      }
      writeToFile(chunkFile, JSON.stringify(part, null, 4))
      const chunk = part.choices[0]?.delta?.content || ""
      content += chunk
      writeToFile(`${outFolder}/output.ts`, content)
      reason = part.choices[0]?.finish_reason || ""
    }

NOTE: I think I tracked my issue down to token allocation. If I don’t allocate max_tokens properly, this condition happens.
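A minimal sketch of that with the 4.x Node client (the model, prompt, and max_tokens value below are only placeholders): set max_tokens explicitly and watch finish_reason, where "length" means the completion was cut off before it finished.

    import OpenAI from "openai"

    const openai = new OpenAI() // reads OPENAI_API_KEY from the environment

    async function main() {
      const stream = await openai.chat.completions.create({
        model: "gpt-3.5-turbo-16k",
        messages: [{ role: "user", content: "Refactor the following code ..." }],
        max_tokens: 4096, // leave enough of the context window for the completion
        stream: true,
      })

      let content = ""
      let finishReason: string | null = null

      for await (const part of stream) {
        content += part.choices[0]?.delta?.content ?? ""
        finishReason = part.choices[0]?.finish_reason ?? finishReason
      }

      // "length" means the output hit max_tokens (or the context window)
      // and was truncated, which is when partial or odd output shows up
      if (finishReason === "length") {
        console.warn("Completion truncated: raise max_tokens or shrink the prompt")
      }

      console.log(content)
    }

    main()

With gpt-3.5-turbo-16k the prompt and the completion share the 16k context window, so a very large refactor payload leaves correspondingly less room for max_tokens.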