ChatGPT stream API response has chunks messed up

Hello
I am using a fairly simple library to work with the OpenAI API, and here is what it does in streaming mode:

  def json_post(path:, parameters:)
    to_json(conn.post(uri(path: path)) do |req|
      if parameters[:stream].respond_to?(:call)
        req.options.on_data = to_json_stream(user_proc: parameters[:stream])
        parameters[:stream] = true # Necessary to tell OpenAI to stream.
      elsif parameters[:stream]
        raise ArgumentError, "The stream parameter must be a Proc or have a #call method"
      end

      req.headers = headers
      req.body = parameters.to_json
    end&.body)
  end

  def to_json_stream(user_proc:)
    proc do |chunk, _|
      chunk.scan(/(?:data|error): (\{.*\})/i).flatten.each do |data|
        user_proc.call(JSON.parse(data))
      rescue JSON::ParserError
        # Ignore invalid JSON.
      end
    end
  end

So it basically just extracts data chunks from the response and passes each parsed one into the block the user supplied to the API.
However, when I append the chunks to a string one by one, I can see they are mangled: the order is wrong, some of them are missing, etc.
Is anyone facing the same issue? Any options to mitigate it?


The chunks are only messed up when using an alternative host: oai_hconeai_com.
Requests without host customisation return the correct response in stream mode.


But how can this be prevented? I hit the same issue: chunks were returned out of order.