I want to parse a streamed chat completion response. The response is a JSON array, and I need to parse individual elements as they are returned. I'm using the new Node API library v4.0.1. I tried using a Node library for stream parsing - stream-json - but the types seem incompatible.
Has anyone been able to achieve something similar?
The response I am generating is fairly big (10-15 question answers), so I wanted to cut the time it takes to show something to the user.
I have written custom code to achieve this, but wanted to see if there is a way to use an existing library like stream-json.
My Custom Code:
import OpenAI from "openai";

const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY, // This is also the default and can be omitted
});

const stream = await openai.chat.completions.create({
  model: "gpt-3.5-turbo",
  stream: true,
  messages: [
    { role: "system", content: "You are a helpful assistant." },
    { role: "user", content: "Return an array of 5 JSON objects. Each object contains two keys - head and body. Values are random words. Return only the JSON array. Do not include any additional commentary in the response." },
  ],
});

let data = ""; // Accumulates the streamed chunks of response text

for await (const part of stream) {
  data += part.choices[0]?.delta?.content || "";

  // Extract every complete object accumulated so far. A single chunk can
  // finish more than one object, so loop until no "}" remains. Matching on
  // indexOf("}") only works because the objects are flat (not nested).
  let endIndex;
  while ((endIndex = data.indexOf("}")) !== -1) {
    const startIndex = data.indexOf("{");
    const jsonObject = data.slice(startIndex, endIndex + 1); // Extract one complete object
    data = data.slice(endIndex + 1); // Remove the extracted object from the buffer
    try {
      const parsedObject = JSON.parse(jsonObject);
      console.log(parsedObject); // Handle the parsed JSON object here
      res.write(jsonObject); // `res` is the Express response being streamed to the client
      // Make an API call
    } catch (err) {
      console.error("Error while parsing JSON:", err);
    }
  }
}