… while the run is still “in_progress”
I’m using the new streaming Assistants API on Node.js like this:
const stream = await openai.beta.threads.runs.create(
  threadId,
  {
    assistant_id: process.env["OPENAI_ASSISTANT_ID"],
    stream: true,
  },
);

for await (const event of stream) {
  ...
}
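For context, the loop body is basically a branch on event.event; a rough sketch (the printing is just illustrative, only the two event types relevant here are shown):

// Rough sketch of an event handler called from the loop above.
async function handleEvent(event) {
  if (event.event === "thread.message.delta") {
    // incremental assistant text as it streams in
    process.stdout.write(event.data.delta.content?.[0]?.text?.value ?? "");
  } else if (event.event === "thread.run.requires_action") {
    // the run is now paused, waiting for tool outputs
    console.log("run", event.data.id, "requires tool outputs");
  }
}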
After I get event.event === "thread.run.requires_action" and call submitToolOutputs, the client receives [END] and the stream no longer emits new events, even though the run is still in_progress (if I immediately call openai.beta.threads.runs.retrieve, I can see it). I’m not sure how to continue streaming the output.
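Concretely, right after the stream ends this still shows a non-terminal status (runId here comes from the requires_action event, event.data.id):

const run = await openai.beta.threads.runs.retrieve(threadId, runId);
console.log(run.status); // still "in_progress", not a terminal state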
I was wondering the same thing and looked here first, then went to the source code. There is a submitToolOutputsStream method!
I’m using it like this when I get to a requires_action state, after handling the tool outputs:
// Resumes streaming events for the same run once tool outputs are submitted.
const toolRun = openai.beta.threads.runs.submitToolOutputsStream(
  thread_id,
  run_id,
  {
    tool_outputs,
    stream: true,
  },
  streamOptions,
);

for await (const event of toolRun) {
  await handleEvent(event);
}
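For completeness, a rough sketch of how that call fits into a full requires_action handler (executeTool here is a stand-in for your own function dispatch, not an SDK method):

// Sketch: build tool_outputs from the requires_action run, then resume
// streaming with submitToolOutputsStream.
async function handleRequiresAction(run) {
  const calls = run.required_action.submit_tool_outputs.tool_calls;
  const tool_outputs = await Promise.all(
    calls.map(async (call) => ({
      tool_call_id: call.id,
      // executeTool is a placeholder for your own tool implementation
      output: JSON.stringify(
        await executeTool(call.function.name, JSON.parse(call.function.arguments)),
      ),
    })),
  );

  const toolRun = openai.beta.threads.runs.submitToolOutputsStream(
    run.thread_id,
    run.id,
    { tool_outputs, stream: true },
  );
  for await (const event of toolRun) {
    await handleEvent(event); // same handler as the first stream
  }
}

If the resumed stream hits another thread.run.requires_action event, the same handler just runs again.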
Glad it’s not just me; I’ve noticed this too.
I think it’s a bug, because according to the docs for the ‘Create run’ method, the ‘stream’ parameter:
returns a stream of events that happen during the Run as server-sent events, terminating when the Run enters a terminal state
https://platform.openai.com/docs/api-reference/runs/createRun#runs-createrun-stream
Since the run isn’t in a terminal state, the stream should continue.
I’m also dealing with this in my implementation. So far, I’m finishing the stream with a function that polls the /threads/{thread_id}/runs endpoint until the status is no longer “queued” or “in_progress”.
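Roughly this shape, as a sketch (I retrieve the single run with runs.retrieve rather than listing the whole runs endpoint):

// Polling fallback: block until the run leaves a non-terminal status,
// then fetch the final messages as usual.
async function waitForRun(threadId, runId) {
  let run = await openai.beta.threads.runs.retrieve(threadId, runId);
  while (run.status === "queued" || run.status === "in_progress") {
    await new Promise((resolve) => setTimeout(resolve, 1000)); // poll every second
    run = await openai.beta.threads.runs.retrieve(threadId, runId);
  }
  return run;
}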
I’m just now seeing in the docs that there’s a stream parameter on the submitToolOutputs endpoint, so I assume we’re supposed to continue our original stream from there. I’ll try to report back on whether or not that works.
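If it behaves like the create-run stream, I’d expect the call to look roughly like this (untested; threadId, runId, and tool_outputs are whatever you already have from the requires_action step):

// Untested sketch: pass stream: true to submitToolOutputs and consume
// the returned stream the same way as the original run stream.
const continued = await openai.beta.threads.runs.submitToolOutputs(
  threadId,
  runId,
  { tool_outputs, stream: true },
);
for await (const event of continued) {
  // handle events exactly as in the first loop
}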