I currently have a backend written in Python Flask that uses a Redis Queue (RQ) worker to run OpenAI requests in the background and then return the result. Does anyone know if it's possible to have the worker stream the intermediate text in a way that a React/TypeScript frontend can receive? Is there a better solution for this?
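One common pattern for this setup is to have the RQ worker publish each OpenAI chunk to a Redis pub/sub channel, while a Flask route subscribes to that channel and re-emits the chunks as Server-Sent Events (SSE), which the React frontend can consume with the browser's `EventSource`. Below is a minimal, dependency-free sketch of the SSE side of that pattern; the channel name, the `format_sse` helper, and the simulated chunk list are all illustrative, and the comments show where the real `redis` and `flask` calls would plug in:

```python
from typing import Iterable, Iterator, Optional


def format_sse(data: str, event: Optional[str] = None) -> str:
    """Frame one message in the SSE wire format (data/event fields,
    terminated by a blank line)."""
    msg = f"data: {data}\n\n"
    if event is not None:
        msg = f"event: {event}\n{msg}"
    return msg


def sse_stream(chunks: Iterable[str]) -> Iterator[str]:
    """Yield SSE-framed chunks, then a terminating 'done' event.

    In the real app, `chunks` would come from a Redis pub/sub listener
    fed by the RQ worker, e.g. (hypothetical channel name):

        pubsub = redis.Redis().pubsub()
        pubsub.subscribe("job:<id>")  # worker PUBLISHes each delta here
        chunks = (m["data"].decode() for m in pubsub.listen()
                  if m["type"] == "message")

    and the Flask route would return:

        Response(sse_stream(chunks), mimetype="text/event-stream")
    """
    for chunk in chunks:
        yield format_sse(chunk)
    yield format_sse("[DONE]", event="done")


# Simulated worker output standing in for streamed OpenAI deltas:
parts = ["Hel", "lo ", "world"]
print("".join(sse_stream(parts)))
```

On the frontend, `new EventSource("/stream/<job_id>")` with an `onmessage` handler appending `event.data` to state is enough to render the text as it arrives; a WebSocket would also work, but SSE is simpler when the data only flows server-to-client.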