How to stream responses in JavaScript?

I’m using SSE for streaming, as recommended by the docs:

    var source = new SSE(
      "https://api.openai.com/v1/engines/curie/completions/browser_stream",
      {
        headers: {
          "Content-Type": "application/json",
          Authorization: "Bearer API_KEY",
        },
        method: "GET",
        payload: JSON.stringify({
          engine: "curie",
          prompt: text,
          temperature: 0.75,
          top_p: 0.95,
          max_tokens: 3,
          stop: ["\n\n"],
        }),
      }
    );

But my payload params aren’t getting recognized. Anyone able to make this work?

Turns out you have to put the payload into the URL as query parameters, i.e. https://api.openai.com/v1/engines/curie/completions/browser_stream?prompt='Once upon a time'&max_tokens=3
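To avoid hand-encoding the query string, it can be built with URLSearchParams (a sketch; the parameter names are taken from the snippets above, and the prompt value is just an example):

```javascript
// Build the GET URL for the browser_stream endpoint by encoding the
// request parameters as a query string instead of a JSON payload.
const params = new URLSearchParams({
  prompt: "Once upon a time",
  max_tokens: "3",
  temperature: "0.75",
});

const url =
  "https://api.openai.com/v1/engines/curie/completions/browser_stream?" +
  params.toString();
```

URLSearchParams handles the percent/plus encoding for you, which matters once the prompt contains spaces or punctuation.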


Hope this answers your question, if I understood it correctly.


Thanks for the reply! I tried that, but the _stream endpoint only accepts GET requests, which can’t contain a body. I think putting it into the URL is the only way…

It seems like you can use stream: true with the regular POST completions endpoint if you use this library:

They actually recommend it in the documentation:

If you’d like to stream results from the POST variant in your browser, consider using the SSE library.

If you look into the code, it seems like it’s listening to the progress event on an XHR instance and parsing that data into tokens, so I’m guessing setting stream: true in the POST case just enables chunked transfer encoding on the response.

Since streaming responses to a POST request can also be handled on the server, I would personally opt for that over server-sent events or any browser implementation, so you don’t have to disclose your private key to anyone on the client (if you plan on shipping what you’re working on to users).

OK, that was definitely a lack of knowledge on my part.

Thanks for the clarification :pray:. When I try

    var source = new SSE(
      "https://api.openai.com/v1/engines/curie/completions",
      {
        headers: {
          "Content-Type": "application/json",
          Authorization: "Bearer API_KEY",
        },
        method: "POST",
        payload: JSON.stringify({
          engine: "curie",
          prompt: text,
          temperature: 0.75,
          top_p: 0.95,
          max_tokens: 3,
          stream: true,
          stop: ["\n\n"],
        }),
      }
    );

    source.addEventListener("message", function (e: any) {
      // Assuming we receive JSON-encoded data payloads:
      var payload = JSON.parse(e.data);
      console.log(payload);
    });

    source.stream();

I get errors. What does a working implementation look like?


Are you running this in the browser or on the server? You should only use the sse.js library in the browser. If you’re on the server (in Node), you can use node-fetch and read from the res.body stream to get data as it comes:
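For the server-side route, a minimal sketch (assuming Node 18+, where fetch is built in; with node-fetch on older versions the res.body iteration works the same way; the function name and parameter values are illustrative):

```javascript
// Sketch only: stream a completion on the server and log each chunk
// as the API sends it. API key and parameters are placeholders.
async function streamCompletion(prompt, apiKey) {
  const res = await fetch(
    "https://api.openai.com/v1/engines/curie/completions",
    {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: "Bearer " + apiKey,
      },
      body: JSON.stringify({
        prompt,
        max_tokens: 10,
        stream: true,
        stop: ["\n\n"],
      }),
    }
  );

  // res.body is a readable stream: chunks arrive as the API emits them,
  // so you can process or forward each one immediately.
  for await (const chunk of res.body) {
    console.log("Received", Buffer.from(chunk).toString("utf8"));
  }
}
```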

If you’re using this in the browser, could you record a loom or tell us exactly which errors you’re getting?


I’m cool with either - type out 100-200 words or fire off a recording. Whatever you’re more comfortable with is good for me! :stuck_out_tongue:

@m-a.schenk, as you suggested, the code snippet works if I remove the engine param from the payload:

    var source = new SSE(
      "https://api.openai.com/v1/engines/curie/completions",
      {
        headers: {
          "Content-Type": "application/json",
          Authorization: "Bearer " + OPENAI_KEY,
        },
        method: "POST",
        payload: JSON.stringify({
          prompt: selected,
          temperature: 0.75,
          top_p: 0.95,
          max_tokens: 3,
          stream: true,
          stop: ["\n\n"],
        }),
      }
    );

    source.addEventListener("message", function (e: any) {
      // Assuming we receive JSON-encoded data payloads:
      var payload = JSON.parse(e.data);
      console.log(payload);
    });

    source.stream();

@vhmth using fetch also works, but seems to emit all the messages at once rather than separately?

    var es = await fetch(
      "https://api.openai.com/v1/engines/davinci/completions",
      {
        headers: {
          "Content-Type": "application/json",
          Authorization: "Bearer " + OPENAI_KEY,
        },
        method: "POST",
        body: JSON.stringify({
          prompt: selected,
          temperature: 0.75,
          top_p: 0.95,
          max_tokens: 10,
          stream: true,
          stop: ["\n\n"],
        }),
      }
    );

    const reader = es.body?.pipeThrough(new TextDecoderStream()).getReader();

    while (true) {
      const res = await reader?.read();
      if (res?.done) break;
      console.log("Received", res?.value);
    }

Is it possible to do it using Python?

Here is a Python example.


Is it just me, or am I occasionally getting malformed JSON when streaming responses from ChatGPT?

{"id":"chatcmpl-71fNyTgERT9GuUKKtkd7p03Wbxomy","object":"chat.completion.chunk","created":1680631770,"model":"gpt-4-0314","choices":[{"delta":{"role":"assistant"},"index":0,"finish_reason":null}]}

 {"id":"chatcmpl-71fNyTgERT9GuUKKtkd7p03Wbxomy","object":"chat.completion.chunk","created":1680631770,"model":"gpt-4-0314","choices":[{"delta":{"content":"Chat"},"index":0,"finish_reason":null}]}

Is this happening for anyone else?


This looks like two separate, valid JSON objects.

Indeed, sometimes the API will return several JSON objects in one chunk, and you need to split them up. Take a look at https://github.com/openai/openai-node/issues/18 for some examples. That page helped me a lot.


Yes, I have tried that. Currently I insert a comma between each pair of JSON objects and add a bracket at the start and end.

@SimonJonsson Actually that was really helpful, as my previous logic had some major edge cases and broke quite frequently. For anyone else browsing this thread, here’s the link to refer to


Does anyone know a good way to protect your OpenAI API key when putting this into a website?
