So I'm building an app (React on the front end + an Express.js server) with the Assistants API. When the conversation starts and the user says hey, I use function calling to get user data in order to customize the greeting message.
Before introducing function calling everything worked seamlessly. Now that I'm trying to integrate it, I'm having trouble: the assistant run gets stuck and doesn't stream the response after getting the tool outputs (yes, I've read the docs back to back). Here's my code; for simplicity's sake I've trimmed the unnecessary parts.
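For context, the assistant is configured with a single function tool, roughly like this (the name get_user_data and the exact schema are simplified stand-ins for my real tool):

await openai.beta.assistants.update(ASSISTANT_ID, {
  tools: [
    {
      type: "function",
      function: {
        name: "get_user_data", // simplified stand-in for my real tool name
        description: "Fetch the user's profile to personalize the greeting",
        parameters: {
          type: "object",
          properties: {
            userId: { type: "integer", description: "The app's user id" },
          },
          required: ["userId"],
        },
      },
    },
  ],
});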
Client (processing the events received from my backend):
const response = await fetch(
  `${process.env.REACT_PUBLIC_BACKEND_API}/virtual-coach/chat`,
  {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      message: text,
      userId: 110,
      threadId,
    }),
    signal: abortController.signal,
  }
);

const reader = response.body.getReader();
const decoder = new TextDecoder();

while (true) {
  const { value, done } = await reader.read();
  if (done) break;
  const chunk = decoder.decode(value);
  const lines = chunk.split("\n\n");
  for (const line of lines) {
    if (line.startsWith("data: ")) {
      try {
        const eventData = JSON.parse(line.slice(6));
        console.log("eventData", eventData);
        switch (eventData.type) {
          case "textDelta":
            if (eventData.delta && eventData.delta.value) {
              // handle textDelta logic (append to the UI)
            }
            break;
          case "toolCallCreated":
            console.log("toolCallCreated", eventData);
            // handle tool calls if needed
            break;
          case "end":
            console.log("end", eventData);
            break;
        }
      } catch (jsonError) {
        console.error("Error parsing JSON:", jsonError);
      }
    }
  }
}
Now my Express router:
router.post("/chat", async (req, res) => {
res.writeHead(200, {
"Content-Type": "text/event-stream",
"Cache-Control": "no-cache",
Connection: "keep-alive",
});
try {
await openai.beta.threads.messages.create(threadId, {
role: "user",
content: message,
});
const stream = openai.beta.threads.runs.stream(threadId, {
assistant_id: ASSISTANT_ID,
});
const sendStreamData = (type, content) => {
res.write(`data: ${JSON.stringify({ type, content })}\n\n`);
};
stream.on("runStepCreated", (runStep) => {
if (!currentRunId) {
currentRunId = runStep.run_id;
}
});
stream.on("textCreated", (text) => {
res.write(
`data: ${JSON.stringify({ type: "textCreated", content: text })}\n\n`
);
});
stream.on("messageCreated", (message) => {
res.write(
`data: ${JSON.stringify({
type: "messageCreated",
content: message,
})}\n\n`
);
});
stream.on("textDelta", (delta, snapshot) => {
res.write(
`data: ${JSON.stringify({ type: "textDelta", delta, snapshot })}\n\n`
);
});
stream.on("toolCallCreated", async (toolCall) => {});
stream.on("toolCallDelta", (delta, snapshot) => {});
stream.on("toolCallDone", async (toolCall) => {});
stream.on("event", async (event) => {
// this didn't work
if (event.event === "thread.run.requires_action") {
const toolsResults = await handleToolCalls(
event.data.required_action.submit_tool_outputs.tool_calls,
sendStreamData,
userId
);
const outputStream =
await openai.beta.threads.runs.submitToolOutputsStream(
threadId,
currentRunId,
{ tool_outputs: toolsResults }
);
for await (const event of outputStream) {
if (event.event === "thread.message.delta") {
res.write(
`data: ${JSON.stringify({
type: "textDelta",
content: event.data.delta.content[0]?.text?.value,
})}\n\n`
);
}
}
}
});
stream.on("end", async () => {
res.end();
});
} catch (error) {
// Error handling not implemented in the original code
}
});
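In case it matters, handleToolCalls just resolves each tool call and returns the array that submitToolOutputsStream expects. Roughly this (again, get_user_data and the getUserData helper are simplified stand-ins for my real code):

async function handleToolCalls(toolCalls, sendStreamData, userId) {
  const outputs = [];
  for (const toolCall of toolCalls) {
    sendStreamData("toolCallCreated", toolCall);
    // simplified: my real version dispatches on several tool names
    if (toolCall.function.name === "get_user_data") {
      const userData = await getUserData(userId); // my own DB helper
      outputs.push({
        tool_call_id: toolCall.id,
        output: JSON.stringify(userData),
      });
    }
  }
  return outputs;
}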
The problem is that toolCallDone gets triggered and then the stream instantly ends. So despite calling
const toolsResults = await handleToolCalls(
  event.data.required_action.submit_tool_outputs.tool_calls,
  sendStreamData,
  userId
);
const outputStream =
  await openai.beta.threads.runs.submitToolOutputsStream(
    threadId,
    currentRunId,
    { tool_outputs: toolsResults }
  );
to stream the responses, nothing seems to work!
What I tried as a workaround: as soon as my client receives the toolCallDone event, it hits an API endpoint that appends the tool call outputs to the run. But I don't like this approach, since it means another round trip to the server (rough sketch below).
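The fallback endpoint looks roughly like this (the route name is just illustrative; handleToolCalls is the same helper as above):

// fallback route the client hits after seeing toolCallDone (extra round trip)
router.post("/submit-tool-outputs", async (req, res) => {
  const { threadId, runId, userId } = req.body;

  res.writeHead(200, {
    "Content-Type": "text/event-stream",
    "Cache-Control": "no-cache",
    Connection: "keep-alive",
  });

  // re-fetch the run to get the pending tool calls, resolve them, then resume
  const run = await openai.beta.threads.runs.retrieve(threadId, runId);
  const toolsResults = await handleToolCalls(
    run.required_action.submit_tool_outputs.tool_calls,
    (type, content) =>
      res.write(`data: ${JSON.stringify({ type, content })}\n\n`),
    userId
  );

  const outputStream =
    await openai.beta.threads.runs.submitToolOutputsStream(threadId, runId, {
      tool_outputs: toolsResults,
    });

  for await (const event of outputStream) {
    if (event.event === "thread.message.delta") {
      res.write(
        `data: ${JSON.stringify({
          type: "textDelta",
          content: event.data.delta.content[0]?.text?.value,
        })}\n\n`
      );
    }
  }
  res.end();
});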
Is there a way to make the run continue its flow as soon as I receive the tool call outputs? The stream is currently stuck at the required_action step!!