Stream OpenAI API from Next.js

Hello,

I’m facing an issue reading data chunk by chunk. I’m using a Kotlin backend (which does the actual call to OpenAI) and a Next.js front end.

In my Kotlin code, I return the chunks this way:

call.respondTextWriter(ContentType.Text.Plain, HttpStatusCode.OK) {
    streamFlowable.collect { chunk ->
        log.trace(chunk)
        // Write each chunk and flush immediately so it is sent right away
        write(chunk)
        flush()
    }
}

The log.trace call displays each chunk, so this part works well.

Then, in my client component, I make a direct call to this route:

const response = await fetch(`/api/streamquestion/${id}`, {
    method: 'POST',
    headers: {
        'Content-Type': 'application/json'
    },
    body: JSON.stringify(question)
});

const textDecoder = new TextDecoder();

let answer = "";
if (response.body != null) {
    const reader = response.body.getReader();

    while (true) {
        console.log("log");
        // read() resolves with the next chunk, or done = true when the stream ends
        const { value, done } = await reader.read();
        if (done) {
            break;
        }
        const decodedString = textDecoder.decode(value, { stream: true });
        answer += decodedString;
        console.log("Chunk received:", decodedString);
    }
    console.log("Full answer:", answer);
}

The URL in the fetch call is redirected to the correct route by a rewrite in my next.config.js.
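For reference, a minimal sketch of what that rewrite looks like; the destination host and port below are placeholders, not my actual backend address:

// next.config.js — minimal sketch of the rewrite;
// the destination host/port is a placeholder
module.exports = {
    async rewrites() {
        return [
            {
                source: '/api/streamquestion/:id',
                destination: 'http://localhost:8080/api/streamquestion/:id'
            }
        ];
    }
};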

As I said, my backend logs show that the data is sent chunk by chunk. On the front end, console.log("log") is triggered only once, and then console.log("Chunk received:", decodedString) displays the full answer with all the chunks only when the call finally completes.

Does anyone have an idea?

When I put this exact code in a server component, the stream works well, but since a server component is stateless, I can’t stream the chunks to the client from there.

Additional information:
I followed this tutorial: https://www.youtube.com/watch?v=5rhvyHvocaA
Here is his GitHub repo: GitHub - Adam-Thometz/OpenAI-Stream-Experiment

The block of code where he performs the stream can be found in models/Chat.js.

Did you confirm the fetch() call to OpenAI includes "stream: true" in the body?
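Something like this, where stream: true is set on the request body (a sketch only; the model and message content are placeholders):

// Sketch of a streaming call to the OpenAI chat completions endpoint;
// the model and message content are placeholders
const response = await fetch('https://api.openai.com/v1/chat/completions', {
    method: 'POST',
    headers: {
        'Content-Type': 'application/json',
        'Authorization': `Bearer ${process.env.OPENAI_API_KEY}`
    },
    body: JSON.stringify({
        model: 'gpt-3.5-turbo',
        messages: [{ role: 'user', content: 'Hello' }],
        stream: true // without this, the API returns one complete response at the end
    })
});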