Based on the Python SDK documentation, I managed to get a streaming example in Streamlit.
Hope this helps!
Demo
Code
This is cool. Thanks for sharing. The app only allows one question, though.
I’ve managed to get this up with Streamlit’s chat interface to allow multiple questions.
Will share soon.
Great. Thanks for the update. I look forward to it
Added it here.
To keep within Streamlit’s chat interface, we also need to store the chat history locally via st.session_state so it can be re-rendered on each rerun; a sketch of that bookkeeping follows the snippet below. The streaming run itself is created like this:
stream = client.beta.threads.runs.create(
    thread_id=st.session_state.thread_id,
    assistant_id=ASSISTANT_ID,
    tool_choice={"type": "code_interpreter"},
    stream=True,
)
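As a minimal sketch of the session_state bookkeeping (the messages key and the chat_input prompt are just illustrative names, not taken from the linked code):

import streamlit as st

# Keep the conversation in session_state so it survives Streamlit reruns.
if "messages" not in st.session_state:
    st.session_state.messages = []

# Re-render the stored history at the top of every rerun.
for message in st.session_state.messages:
    with st.chat_message(message["role"]):
        st.markdown(message["content"])

# Store each new user prompt before sending it to the thread.
if prompt := st.chat_input("Ask a question"):
    with st.chat_message("user"):
        st.markdown(prompt)
    st.session_state.messages.append({"role": "user", "content": prompt})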
Then, for each chunk in the stream, we need to check the event type and store/render it differently; a rough sketch of that loop is below.
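This is only a sketch, assuming the stream created above is in scope; the event names come from the Assistants API streaming events, while the rendering details (the st.empty() placeholder, skipping run-step deltas) are illustrative:

assistant_reply = ""
with st.chat_message("assistant"):
    placeholder = st.empty()
    for event in stream:
        if event.event == "thread.message.delta":
            # Text arrives piece by piece; accumulate the deltas and re-render.
            for block in event.data.delta.content:
                if block.type == "text":
                    assistant_reply += block.text.value
                    placeholder.markdown(assistant_reply)
        elif event.event == "thread.run.step.delta":
            # Code Interpreter input/output events could be handled here,
            # e.g. rendered as code blocks.
            pass

# Keep the finished reply in the local history so it is re-rendered on later reruns.
st.session_state.messages.append({"role": "assistant", "content": assistant_reply})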
Another, simpler implementation can be found here.
This is exactly what I am looking for in my project (a demo of an LLM UI)! Thanks for sharing!