Streaming assistant responses

Streaming assistant responses is something I’ve been wanting for a while. I’ve tried polling messages, but they don’t have any content until the run is completed, so that won’t work for streaming. I noticed that the assistant shown in the dev day demo streamed its responses. The presenter shows some code about 35 minutes in where he appears to be receiving server-sent events from the thread runs API (fetchEventSource); unfortunately the headers and body of the POST request are cut off.

Has anyone worked out how to get this endpoint to support streaming? When I make a POST request to run a thread with stream: true in the body, I get an "Unknown parameter: 'stream'." error back, and bumping the OpenAI-Beta header version doesn’t work either. Has anyone else looked at getting this working, or had any success?
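For reference, here’s roughly the request I’m attempting, using fetchEventSource the way the demo appeared to. The thread ID and assistant ID are placeholders, and the exact headers and body fields are my guesses, since that part is cut off in the video:

```ts
import { fetchEventSource } from "@microsoft/fetch-event-source";

// Sketch of the attempted streaming run request. The stream flag is the part
// the API currently rejects with "Unknown parameter: 'stream'."
async function streamRun(threadId: string, assistantId: string) {
  await fetchEventSource(`https://api.openai.com/v1/threads/${threadId}/runs`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
      "Content-Type": "application/json",
      "OpenAI-Beta": "assistants=v1",
    },
    body: JSON.stringify({
      assistant_id: assistantId,
      stream: true, // rejected: "Unknown parameter: 'stream'."
    }),
    onmessage(event) {
      // If streaming were supported, I'd expect message deltas here as SSE data
      console.log(event.data);
    },
    onerror(err) {
      throw err; // rethrow so fetchEventSource doesn't keep retrying
    },
  });
}
```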


Right now streaming is explicitly not supported.

This video and his next video are helpful for streaming: https://www.youtube.com/watch?v=F-KRs6vg4mM

Thanks. I skimmed through the video, but it doesn’t appear to stream messages from the assistant; it waits until the thread run is completed before retrieving the full messages. I was looking for something like the streaming functionality the chat completions endpoint provides, but from the Assistants endpoint.
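In other words, the video uses the standard non-streaming pattern: create a run, poll its status, then fetch the completed messages at the end. Roughly this (a sketch from memory; the one-second delay and placeholder IDs are arbitrary):

```ts
const headers = {
  Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
  "Content-Type": "application/json",
  "OpenAI-Beta": "assistants=v1",
};

async function runAndWait(threadId: string, assistantId: string) {
  // Kick off the run
  const run = await (
    await fetch(`https://api.openai.com/v1/threads/${threadId}/runs`, {
      method: "POST",
      headers,
      body: JSON.stringify({ assistant_id: assistantId }),
    })
  ).json();

  // Poll until the run finishes — no partial content is available in the meantime
  let status = run.status;
  while (status === "queued" || status === "in_progress") {
    await new Promise((resolve) => setTimeout(resolve, 1000));
    const check = await (
      await fetch(`https://api.openai.com/v1/threads/${threadId}/runs/${run.id}`, { headers })
    ).json();
    status = check.status;
  }

  // Only now do the assistant's messages contain any content
  const messages = await (
    await fetch(`https://api.openai.com/v1/threads/${threadId}/messages`, { headers })
  ).json();
  return messages.data;
}
```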

Yeah, I know. I was just hoping, given it was in the dev day demo, that it was implicitly supported!