Read Stream Response in Chunks using NodeJS with Function Calling enabled

Hi,

Any ideas on how to do this in Node.js? I need to use function calling with streaming enabled in a Next.js app.

I saw this code in the cookbook…

Python code

import openai

response = openai.ChatCompletion.create(
    model='gpt-3.5-turbo',
    messages=[
        {'role': 'user', 'content': "What's 1+1? Answer in one word."}
    ],
    temperature=0,
    stream=True,
    functions=...,
)

for chunk in response:
    print(chunk)

This is the Node.js example from the GitHub docs. It looks like it might be possible; I haven't tried it yet, but I'm interested to see.

import OpenAI from 'openai';

const openai = new OpenAI();

async function main() {
  const stream = await openai.chat.completions.create({
    model: 'gpt-4',
    messages: [{ role: 'user', content: 'Say this is a test' }],
    stream: true,
  });
  // Each chunk carries a partial delta; print the text as it arrives.
  for await (const part of stream) {
    process.stdout.write(part.choices[0]?.delta?.content || '');
  }
}

main();

Found here
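
Extending that example for function calling: when functions are enabled, the streamed deltas carry the call name and the argument JSON in fragments instead of content, so you accumulate them until the stream ends. Here is a rough sketch, assuming the v4 openai Node SDK; the get_weather function definition is just a made-up placeholder.

import OpenAI from 'openai';

const openai = new OpenAI();

async function main() {
  const stream = await openai.chat.completions.create({
    model: 'gpt-3.5-turbo',
    messages: [{ role: 'user', content: "What's the weather in Boston?" }],
    // Hypothetical function definition, only here so the model emits a function call.
    functions: [
      {
        name: 'get_weather',
        description: 'Get the current weather for a city',
        parameters: {
          type: 'object',
          properties: { city: { type: 'string' } },
          required: ['city'],
        },
      },
    ],
    stream: true,
  });

  let functionName = '';
  let functionArgs = '';

  for await (const part of stream) {
    const delta = part.choices[0]?.delta;
    // Normal assistant text arrives in delta.content...
    if (delta?.content) process.stdout.write(delta.content);
    // ...while a function call arrives as partial name/argument strings.
    if (delta?.function_call?.name) functionName += delta.function_call.name;
    if (delta?.function_call?.arguments) functionArgs += delta.function_call.arguments;
  }

  // The argument string is only complete (and valid JSON) once the stream has finished.
  if (functionName) {
    console.log('\nFunction call:', functionName, JSON.parse(functionArgs));
  }
}

main();

In a Next.js route handler you would forward those pieces to the client (for example over a ReadableStream or SSE) instead of writing to stdout, but the accumulation logic stays the same.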
