How do you use OpenAIStream in Assistant Threads

I have a simple Next.js app to create and use threads. I can’t get the content of the response to stream back to the client. Is OpenAIStream usable with threads?

import OpenAI from "openai";
import { NextResponse } from "next/server";
import config from "./../../config";
import { OpenAIStream, StreamingTextResponse } from "ai";

const openai = new OpenAI({
  apiKey: config.openAIKey,
});

export async function POST(request: Request) {
  // Create a thread with a message.
  const thread = await openai.beta.threads.create({
    messages: [
      {
        role: "user",
        content: "What are the best Software engineering patterns to use today?",
      },
    ],
  });

  // Create a run for the thread.
  const run = await openai.beta.threads.runs.create(thread.id, {
    assistant_id: config.openAIAssistantId,
  });

  // Wait for the run to complete.
  let runStatus = "in_progress";
  while (runStatus === "in_progress") {
    await new Promise((resolve) => setTimeout(resolve, 1000));
    console.log("waiting for run to complete");
    const run_response = await openai.beta.threads.runs.retrieve(thread.id, run.id);
    runStatus = run_response.status;
  }

  // Display the Assistant's response.
  const messages: any = await openai.beta.threads.messages.list(thread.id);
  // need to format this for vercel useChat hook
  return NextResponse.json({ status: "ok" });
  // const stream = await OpenAIStream(messages);
  // return new StreamingTextResponse(stream);
}

There are a couple of things to note:

while (runStatus === "in_progress")

The status can, and usually does, start as "queued". You need to account for this in your code, because otherwise you can fall straight past your while loop before the job has completed.


  let runStatus = "in_progress";
  while (runStatus === "in_progress") {

is a hacky solution. If the run is still queued when the loop first checks, you skip it entirely and read the messages before the run has produced anything. That’s a race condition, and it’s never good practice.
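To make the point concrete, here is a sketch of a safer polling loop that treats "queued" and "in_progress" (and any other non-terminal status) uniformly, and gives up after a bounded number of attempts. The status names follow the Assistants run lifecycle; `waitForRun` and its parameters are my own illustrative names, not part of any SDK.

```typescript
type RunStatus =
  | "queued"
  | "in_progress"
  | "requires_action"
  | "cancelling"
  | "cancelled"
  | "failed"
  | "completed"
  | "expired";

// Statuses after which the run will never change again.
const TERMINAL: ReadonlySet<RunStatus> = new Set([
  "cancelled",
  "failed",
  "completed",
  "expired",
]);

// Polls `getStatus` until a terminal status comes back, or throws after
// `maxAttempts` checks so a stuck run can't hang the request forever.
async function waitForRun(
  getStatus: () => Promise<RunStatus>,
  { intervalMs = 1000, maxAttempts = 60 } = {}
): Promise<RunStatus> {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const status = await getStatus();
    if (TERMINAL.has(status)) return status;
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error("Run did not reach a terminal status in time");
}
```

In the route handler above you would call it with something like `waitForRun(() => openai.beta.threads.runs.retrieve(thread.id, run.id).then((r) => r.status as RunStatus))`.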

Lastly, streaming is not yet available for the Assistants API.
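Until streaming lands, a common workaround is to wait for the run to complete and return the final assistant text as a plain response. A minimal sketch of the extraction step, assuming the documented message shape (a `content` array whose text parts look like `{ type: "text", text: { value } }`, with `messages.list()` returning newest-first) — `latestAssistantText` is an illustrative helper name, not an SDK function:

```typescript
type MessageContent =
  | { type: "text"; text: { value: string } }
  | { type: "image_file"; image_file: { file_id: string } };

interface ThreadMessage {
  role: "user" | "assistant";
  content: MessageContent[];
}

// Returns the text of the most recent assistant message in a messages page,
// or "" if there is none. Assumes the page is ordered newest-first.
function latestAssistantText(messages: ThreadMessage[]): string {
  const reply = messages.find((m) => m.role === "assistant");
  if (!reply) return "";
  return reply.content
    .filter((c): c is Extract<MessageContent, { type: "text" }> => c.type === "text")
    .map((c) => c.text.value)
    .join("\n");
}
```

You could then return that string with `NextResponse.json(...)` (or a plain `Response`) instead of the commented-out `OpenAIStream` lines, and format it however the `useChat` hook expects.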


Thanks for that, good call. I didn’t see the state transitions mentioned in the docs.