I’m working on integrating OpenAI’s GPT-4 into a Node.js application using the official openai npm package. My application includes an assistant that can retrieve a list of countries from a vector database. I’m currently able to create a thread, add messages, and start a streaming run. However, I’m running into problems handling the response after submitting tool outputs.
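For completeness, the client and assistant ID are set up in the usual way (the environment variable names below are just what I use locally):

import OpenAI from "openai";

// Official openai npm package client; reads my API key from the environment.
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });
// ID of the assistant I created beforehand (placeholder variable name).
const assistantId = process.env.OPENAI_ASSISTANT_ID;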
Here are the relevant steps and code snippets:
- Create a thread:
const thread = await openai.beta.threads.create(); // thread.id is saved to req.session.threadId for later requests
- Add a message:
await openai.beta.threads.messages.create(req.session.threadId, {
  role: "user",
  content: req.body.message,
});
- Start a streaming run:
const stream = await openai.beta.threads.runs.create(req.session.threadId, {
  assistant_id: assistantId,
  stream: true,
});
- Handle tool output submission:
for await (const event of stream) {
  if (event.event === "thread.run.requires_action") {
    const toolCalls = event.data.required_action.submit_tool_outputs.tool_calls;
    for (const toolCall of toolCalls) {
      if (toolCall.function.name === "searchCountries") {
        await openai.beta.threads.runs.submitToolOutputs(
          event.data.thread_id,
          event.data.id, // the run ID
          {
            tool_outputs: [
              {
                tool_call_id: toolCall.id,
                output: "Australia, Japan, China", // actually calculated (see sketch below); hard-coded for brevity
              },
            ],
          },
        );
      }
    }
  }
}
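The hard-coded output above actually comes from the vector-database lookup, roughly like this (searchCountriesInVectorDb is my own helper, simplified and renamed here):

// Hypothetical sketch of how the tool output string is produced for one tool call.
async function runSearchCountries(toolCall) {
  // function.arguments is a JSON string of the arguments the model supplied
  const args = JSON.parse(toolCall.function.arguments || "{}");
  const matches = await searchCountriesInVectorDb(args.query); // vector DB lookup, details omitted
  return matches.join(", "); // e.g. "Australia, Japan, China"
}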
My question is: what should I do after awaiting openai.beta.threads.runs.submitToolOutputs?
Specifically, is there another event I should listen for so I can obtain the final result of the run and send it back to the frontend? Previously, I handled this with the following code, but it appears to be outdated according to the latest documentation:
.on("messageDone", async (event) => {
if (event.role === "assistant" && event.content) {
res.json({ aiResponse: event.content[0].text }); // sends message back to the front end
} else {
res.status(500).send("No valid response from the AI.");
}
})
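For context, that handler was attached to the streaming helper, roughly like this (simplified from my old code, so the details may be slightly off):

// Old approach: the runs.stream helper returns a stream that emits named events.
const run = openai.beta.threads.runs.stream(req.session.threadId, {
  assistant_id: assistantId,
});

run.on("messageDone", async (event) => {
  // ...handler shown above...
});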
I can’t find an equivalent of messageDone, nor do I understand how to properly finalize and send the response once the run is complete using the new API structure. Any guidance on how to handle this with the current OpenAI API would be greatly appreciated.