Any updates on Assistant API Streaming?

I'm building a web app using the Assistants API. The lack of streaming is seriously hurting the UI, and it's making me consider just going another route until streaming is available.

Has anyone heard anything about when streaming is expected?

30 Likes

Streaming is already available in the Assistants API, albeit hidden. The list of messages under a thread is essentially a stream: you just consume it in ascending order, marking each message as consumed once you've processed it.

In the upcoming (hopefully soon) library, you'll be able to mark a message as processed as soon as a streaming engine consumes it.
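For what it's worth, a minimal sketch of that consume-ascending pattern with the current Python SDK might look like the following. The handler (here just `print`) is a placeholder, and note that a message's content can stay empty until its run finishes:

```python
# Minimal sketch of the "consume ascending, mark as consumed" idea.
# Assumes an existing thread_id with an active run; this is not token-level streaming.
import time
from openai import OpenAI

client = OpenAI()

def consume_messages(thread_id: str, poll_interval: float = 1.0):
    consumed = set()  # ids of messages already handed off
    while True:  # break out once your run reaches a terminal status
        page = client.beta.threads.messages.list(thread_id=thread_id, order="asc")
        for message in page.data:
            if message.id in consumed:
                continue
            text = "".join(p.text.value for p in message.content if p.type == "text")
            if not text:
                continue  # content is only populated once the run completes
            print(text)               # hand off to your UI here
            consumed.add(message.id)  # mark as consumed
        time.sleep(poll_interval)
```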

1 Like

This workaround wouldn’t stream the answer in real time as it's being written, the way ChatGPT and Perplexity do, right?

Have you found a solution for this? I'm facing the same issue.

Nope, not yet. I'm just going to save this code and go with a different model for now. Once streaming is out, I'll start using Assistants again.

4 Likes

Looking forward to this feature too. Even long polling or webhooks would be beneficial; it's wasteful to have to make the same request over and over until the run status updates.
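Until then, the best I've got is a poll-with-backoff loop along these lines (Python SDK; the delay values are just what I picked, not an official recommendation):

```python
# Rough sketch of the polling that's currently required for a run to finish.
# Assumes an existing thread_id and run_id.
import time
from openai import OpenAI

client = OpenAI()

def wait_for_run(thread_id: str, run_id: str) -> str:
    delay = 0.5
    while True:
        run = client.beta.threads.runs.retrieve(thread_id=thread_id, run_id=run_id)
        if run.status not in ("queued", "in_progress"):
            return run.status  # e.g. completed, requires_action, failed, expired
        time.sleep(delay)
        delay = min(delay * 2, 5.0)  # back off so we don't hammer the endpoint
```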

3 Likes

Could you please elaborate on what you mean, or point me to any relevant documentation (I can't see anything in the API docs)? When I poll a thread whose run is "in_progress", I can retrieve messages, but those messages don't have any text content until the run is "completed". I'm also unsure how to mark a message as consumed via the API. Any help is much appreciated, thank you!

Please see the “Right Semantics for Assistant API” post, published as its own topic.

No, it's not: it's a list of Message objects, and as @rjlynchdev mentioned, the latest message's content isn't populated until the run status turns to completed. It's a very basic RESTful-style API.

In order to stream the new message, OpenAI is likely going to need to add { stream: true } support to their GET /threads/{thread_id}/messages/{message_id} API.
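For comparison, this is roughly what stream support looks like on Chat Completions today with the Python SDK (the model name is just an example); presumably the ask is for something equivalent on messages/runs:

```python
# What stream=True looks like on Chat Completions today; the Assistants API has no equivalent yet.
from openai import OpenAI

client = OpenAI()

stream = client.chat.completions.create(
    model="gpt-4-1106-preview",
    messages=[{"role": "user", "content": "Explain streaming in one paragraph."}],
    stream=True,
)
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)  # tokens arrive as they are generated
```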

3 Likes

I echo your question: streaming is essential to make assistants user-friendly. Waiting for the full response to be populated takes far too long, which significantly reduces the value to the end user.

3 Likes

Any news on when streaming will be supported? I really don’t want to have to hack together a manual version of assistants!

1 Like

Bear in mind that the Assistants API has a few other issues that have been logged during the beta, and so far no fixes have been provided, nor even an ETA.

If you're looking for production-ready solutions, it may be better to build on a different stack until Assistants comes out of beta. If you want retrieval, you can put together a simple stack with Chroma and a GPT model quite quickly. If you want function calling, that's even easier: just describe the tools in your prompts and write the same function-handling code you would have written for the Assistant.
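As a rough illustration only (the collection name, model, and prompt wording are placeholders), the retrieval side of such a stack could look something like this:

```python
# Rough sketch of a retrieval stack with Chroma + Chat Completions streaming,
# as an interim alternative to Assistants retrieval.
import chromadb
from openai import OpenAI

openai_client = OpenAI()
chroma = chromadb.Client()  # in-memory; use PersistentClient(path=...) for disk
collection = chroma.get_or_create_collection("docs")

# Index your documents once (Chroma embeds them with its default embedding function).
collection.add(
    ids=["doc-1", "doc-2"],
    documents=["First support article...", "Second support article..."],
)

def answer(question: str):
    hits = collection.query(query_texts=[question], n_results=3)
    context = "\n\n".join(hits["documents"][0])
    stream = openai_client.chat.completions.create(
        model="gpt-4-1106-preview",
        messages=[
            {"role": "system", "content": f"Answer using this context:\n{context}"},
            {"role": "user", "content": question},
        ],
        stream=True,  # streaming works here, unlike the Assistants API
    )
    for chunk in stream:
        if chunk.choices[0].delta.content:
            print(chunk.choices[0].delta.content, end="", flush=True)
```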

Best of luck!

Would you kindly provide more details? I did not get your meaning.

Any updates on Streaming for the Assistant API?

8 Likes

Streaming is essential. In addition to the Assistants API, invoking functions in parallel through the legacy Chat Completions route also runs into streaming issues when integrating into a standard chatbot scenario. So I'm eagerly anticipating an update (which was promised a few days after the initial beta release).
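In case it helps anyone hitting the parallel-function-call side of this: when streaming, the tool-call arguments arrive as fragments and have to be reassembled by index, roughly like the sketch below (the tool schema and model are just examples):

```python
# Sketch of reassembling parallel tool calls from a streamed Chat Completions response.
import json
from openai import OpenAI

client = OpenAI()

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

stream = client.chat.completions.create(
    model="gpt-4-1106-preview",
    messages=[{"role": "user", "content": "Weather in Paris and Tokyo?"}],
    tools=tools,
    stream=True,
)

calls = {}  # index -> {"name": ..., "arguments": ...}
for chunk in stream:
    delta = chunk.choices[0].delta
    for tc in delta.tool_calls or []:
        entry = calls.setdefault(tc.index, {"name": "", "arguments": ""})
        if tc.function.name:
            entry["name"] = tc.function.name
        if tc.function.arguments:
            entry["arguments"] += tc.function.arguments  # arrives in fragments

for entry in calls.values():
    print(entry["name"], json.loads(entry["arguments"]))
```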

5 Likes

This is a very important issue.

When doing contract work, many (contractual) business requirements are expressed in terms of user experience and “responsiveness”. Most clients will be dissatisfied with a chatbot that takes 15-20 seconds to output anything, which makes this API risky to use.

Looking forward to streaming once it does get implemented :slight_smile:

4 Likes

Please prioritize this feature, OpenAI :grinning:

10 Likes

Waiting for this as well… I really want to implement it in our help section as soon as possible.

2 Likes

Agreed. I’ve switched to different methods until they prioritize this. Can’t ship this in customer-facing projects without streaming in this day and age (how time flies!)

2 Likes

Yes - please support streaming ASAP! Thanks!

3 Likes