ChatGPT streaming API response has chunks messed up

I am using a fairly simple library to work with the OpenAI API, and here is a listing of what it does in streaming mode:

  def json_post(path:, parameters:)
    conn.post(uri(path: path)) do |req|
      if parameters[:stream].respond_to?(:call)
        req.options.on_data = to_json_stream(user_proc: parameters[:stream])
        parameters[:stream] = true # Necessary to tell OpenAI to stream.
      elsif parameters[:stream]
        raise ArgumentError, "The stream parameter must be a Proc or have a #call method"
      end

      req.headers = headers
      req.body = parameters.to_json
    end
  end

  def to_json_stream(user_proc:)
    proc do |chunk, _|
      chunk.scan(/(?:data|error): (\{.*\})/i).flatten.each do |data|
        user_proc.call(JSON.parse(data))
      rescue JSON::ParserError
        # Ignore invalid JSON.
      end
    end
  end

So basically it just extracts the data chunks from the response and passes them to the block the user supplied to the API.
However, when I append the chunks to a string one by one, I see they are messed up: the order is wrong, some of them are missing, etc.
Is anyone facing the same issue? Any options to mitigate it?
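One thing worth ruling out: the `on_data` callback above parses each raw network chunk in isolation, so an SSE event split across two chunks will fail the regex and be silently dropped. A minimal sketch of a buffering workaround (my own illustration, not part of the library; the class name `SSEBuffer` is hypothetical) that only parses complete lines:

```ruby
require "json"

# Sketch: accumulate raw chunks and only parse complete SSE lines
# (terminated by "\n"), so events split across chunk boundaries are
# reassembled instead of being dropped or garbled.
class SSEBuffer
  def initialize(&user_proc)
    @buffer = ""
    @user_proc = user_proc
  end

  # Feed each raw chunk from req.options.on_data into this method.
  def <<(chunk)
    @buffer << chunk
    # Consume every complete line; keep any trailing partial line buffered.
    while (newline = @buffer.index("\n"))
      line = @buffer.slice!(0..newline).strip
      next unless line.start_with?("data:", "error:")

      payload = line.split(":", 2).last.strip
      next if payload == "[DONE]" # OpenAI's end-of-stream marker

      begin
        @user_proc.call(JSON.parse(payload))
      rescue JSON::ParserError
        # Ignore invalid JSON.
      end
    end
  end
end
```

Usage would look something like `buf = SSEBuffer.new { |event| ... }` and then `req.options.on_data = proc { |chunk, _| buf << chunk }`. An event arriving in two pieces is then reassembled before parsing.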


The chunks are messed up when using an alternative host: oai_hconeai_com.
Requests without host customisation return the correct response in stream mode.
