I’m encountering an issue where my OpenAI API calls are extremely slow: even a small response takes around 30-40 seconds to complete. Additionally, I occasionally hit unknown errors when using the streaming feature. I am using the gpt-3.5-turbo-16k-0613 model.
Here’s the error message I receive:
“openai stream error: The server had an error while processing your request. Sorry about that! (Error occurred while streaming.)”
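Since the error looks transient (a server-side failure mid-stream), one workaround I’ve been considering is wrapping the streaming call in a simple retry with exponential backoff. Here is a minimal sketch; the `flaky_stream` stub below is just a stand-in for the actual streaming API call, not real OpenAI client code:

```python
import time

def with_retries(fn, attempts=3, base_delay=0.01):
    """Call fn(); on a transient failure, back off exponentially and retry."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the error
            time.sleep(base_delay * (2 ** attempt))

# Stub standing in for the streaming call: fails twice, then succeeds,
# mimicking the intermittent "server had an error" behavior.
calls = {"n": 0}
def flaky_stream():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("The server had an error while processing your request.")
    return "ok"

result = with_retries(flaky_stream)
print(result)  # -> ok
```

In practice `fn` would be the function that opens the stream and consumes it, so a mid-stream failure restarts the whole request rather than resuming partway.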
While occasional unknown errors are tolerable, the consistently slow responses are quite frustrating. Interestingly, when I tried another OpenAI API key, the responses were much faster. Can you please advise on how I can resolve this?