Streaming not working on Safari?

The first image is from Safari on mobile; Chrome does a little better, but it's still not great.
I added the streaming parameter. It works fine in Chrome on the computer, but on mobile it's somehow missing the delta chunks. The second image is from Chrome on the computer.

Wondering what I’m doing wrong?

Front-end React code:

    const response = await fetch("http://localhost:3080/jung", {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        jobValue: jobValue,
        descriptionValue: descriptionValue,
        nameValue: nameValue,
        conversation: chatLogNew,
        emotions: emotion ? [emotion] : [],
        localHour: new Date().getHours(),
      }),
    });

    // Create a new entry for the assistant's response
    const assistantEntry = { role: "assistant", message: "" };
    setChatLog([...chatLogNew, assistantEntry]);

    // Stream the response
    const reader = response.body.getReader();
    const decoder = new TextDecoder("utf-8");
    let assistantMessage = "";

    while (true) {
      const { value, done } = await reader.read();
      if (done) break;

      // { stream: true } keeps partial multi-byte characters buffered between reads
      const chunk = decoder.decode(value, { stream: true });
      const lines = chunk.split("\n");

      for (const line of lines) {
        if (line.startsWith("data: ")) {
          const data = JSON.parse(line.slice(6));
          if (data.message) {
            assistantMessage += data.message;

            // Update the assistant's message in the state
            setChatLog((prevChatLog) => {
              const updatedChatLog = [...prevChatLog];
              updatedChatLog[updatedChatLog.length - 1].message =
                assistantMessage;
              return updatedChatLog;
            });
          }
        }
      }
    }

    const data = JSON.parse(assistantMessage);
    const tokenData = data.usage;

    sendMessageToFirebase(input, data.message, tokenData);
  }
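One thing worth checking in the read loop above: a network read can end in the middle of a `data:` line, so splitting each chunk on `"\n"` and calling `JSON.parse` on it will occasionally throw or silently drop a delta. A minimal sketch of boundary-safe parsing, assuming the backend emits `data: {...}\n\n` frames like the handler below (the `createSSEParser` name is just for illustration):

```javascript
// Buffer-safe SSE parsing sketch: keep the unfinished tail of the last
// read in `buffer` instead of treating every chunk as complete lines.
function createSSEParser(onMessage) {
  let buffer = "";
  return function feed(chunkText) {
    buffer += chunkText;
    const frames = buffer.split("\n\n");
    buffer = frames.pop(); // last element may be an unfinished frame
    for (const frame of frames) {
      for (const line of frame.split("\n")) {
        if (line.startsWith("data: ")) {
          const data = JSON.parse(line.slice(6));
          if (data.message) onMessage(data.message);
        }
      }
    }
  };
}
```

In the loop you would then call `feed(decoder.decode(value, { stream: true }))` and update state from the `onMessage` callback.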

Back-end index.js/Express:

  const response = await openai.chat.completions.create({
    model: "gpt-4-turbo",
    messages: [
      {
        role: "system",
        content: message,
      },
    ],
    temperature: 1.1,
    top_p: 1,
    frequency_penalty: 0.3,
    presence_penalty: 0.5,
    stream: true,
  });

  try {
    for await (const chunk of response) {
      if (chunk.choices[0]?.delta?.content) {
        res.write(
          `data: ${JSON.stringify({
            message: chunk.choices[0].delta.content,
          })}\n\n`
        );
      }
    }
    res.end(); // close the stream once the model is done
  } catch (error) {
    console.error("Error processing stream:", error);
    res.end();
  }
  res.on("close", () => {
    res.end();
  });
});
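One thing I don't see in the handler above is the response headers. Safari in particular tends to buffer the whole body and deliver it at once unless the response is explicitly marked as an event stream, and compression or proxy layers can add their own buffering. A sketch of the headers that usually help, assuming an Express handler like the one above (`sseHeaders` is just an illustrative helper name):

```javascript
// Headers to send before the first res.write() when streaming SSE.
function sseHeaders() {
  return {
    "Content-Type": "text/event-stream",
    "Cache-Control": "no-cache, no-transform", // no-transform discourages proxy buffering
    Connection: "keep-alive",
    "X-Accel-Buffering": "no", // nginx convention: disable proxy buffering
  };
}

// In the route handler, before the for-await loop:
//   res.writeHead(200, sseHeaders());
//   res.flushHeaders();
```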

I have a similar problem. Have you solved it? I call fetch from the background script of a Safari extension.

Haven't solved it. Safari is the devil.


Thanks for your reply. I also tried the npm package openai, but neither worked.

Try sending just pure text from the backend and see if it's still acting weird. If not, the problem is related to the parsing of the text. Maybe an encoding problem?
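For the encoding angle, the classic pitfall is a multi-byte UTF-8 character split across two network chunks. A runnable sketch (Node or browser console):

```javascript
// A multi-byte character split across two chunks: decoding each chunk
// independently corrupts it; { stream: true } tells TextDecoder to hold
// the partial bytes until the next chunk arrives.
const bytes = new TextEncoder().encode("héllo"); // "é" is 2 bytes in UTF-8
const first = bytes.slice(0, 2); // ends mid-character
const rest = bytes.slice(2);

const naive = new TextDecoder();
const broken = naive.decode(first) + naive.decode(rest); // contains U+FFFD

const streaming = new TextDecoder();
const ok = streaming.decode(first, { stream: true }) + streaming.decode(rest);
// ok === "héllo"
```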

I tried fetch in content.js and background.js.
Content is OK; background is OK at first, then goes pending with no data received.

It simply IS the devil. There’s no way around it. God have mercy on ALL of us.
