I'm using the streaming API with GPT-4, and I've found that the response seems to arrive in chunks of several tokens at a time rather than word by word / token by token. Is there any way to fix this?
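For context, here is a minimal sketch of how streamed chat-completion chunks are typically consumed. The chunk shape mirrors the OpenAI chat-completions stream format (`choices[0].delta.content`), but the sample chunks below are made up for illustration — the point is that each chunk the server sends may carry several tokens' worth of text, and the client has no control over that granularity:

```python
# Sketch: accumulating streamed delta chunks into the full reply text.
# Chunk sizing is decided server-side; a single chunk can contain
# multiple tokens, which is why output appears "chunky" rather than
# strictly token-by-token.

def accumulate(chunks):
    """Join the content pieces from a stream of chunk dicts."""
    parts = []
    for chunk in chunks:
        delta = chunk["choices"][0]["delta"]
        content = delta.get("content")
        if content:  # role-only and final chunks carry no content
            parts.append(content)
    return "".join(parts)

# Hypothetical chunks as a server might send them; note the second
# chunk carries more than one word in a single delta.
sample = [
    {"choices": [{"delta": {"role": "assistant"}}]},
    {"choices": [{"delta": {"content": "Hello"}}]},
    {"choices": [{"delta": {"content": ", world"}}]},
    {"choices": [{"delta": {}}]},  # finish chunk, no content
]
print(accumulate(sample))  # -> Hello, world
```

If finer-grained display is wanted, the usual workaround is client-side: buffer the accumulated text and reveal it to the user at a fixed pace, rather than expecting the API to emit one token per chunk.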