OpenAI stream responses too slow

Hello,

I’m encountering an issue where my OpenAI API calls are extremely slow: even a small response takes around 30–40 seconds to complete. Additionally, I occasionally hit unknown errors when using the stream feature. I am using the gpt-3.5-turbo-16k-0613 model.

Here’s the error message I receive:
“openai stream error: The server had an error while processing your request. Sorry about that! (Error occurred while streaming.)”

While occasional unknown errors can be tolerable, the consistently slow response is quite frustrating. Interestingly, when I tried another OpenAI API key, the response was much faster. Can you please advise on how I can resolve this?

Thank you.

Hi and welcome to the Developer Forum!

Sorry to hear you are having an issue with your API calls. Changes in performance over time are quite common as more and more people start using the services and the new hardware to support them gets installed.

It is always a balancing act between allowing new users access to the models and the performance the models provide. If you wish to report your issue to OpenAI, you should do so via the help.openai.com website, using the support bot in the bottom right corner.
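In the meantime, a common workaround for transient mid-stream server errors like the one quoted above is to retry the whole stream with exponential backoff. Here is a minimal sketch of that idea; the `stream_with_retry` helper and the flaky-stream demo are hypothetical (not part of the OpenAI SDK), and in real code you would pass a lambda wrapping your actual streaming API call and catch the SDK's error type instead of `RuntimeError`:

```python
import time

def stream_with_retry(make_stream, max_retries=3, base_delay=1.0):
    """Consume a streaming iterator, retrying from scratch on transient
    errors. `make_stream` is any zero-argument callable that returns an
    iterator of chunks (e.g. a lambda wrapping a streaming API call).
    Hypothetical helper for illustration only."""
    for attempt in range(max_retries + 1):
        try:
            # Collecting inside the try also catches errors raised
            # partway through the stream, not just on the first chunk.
            return list(make_stream())
        except RuntimeError:  # stand-in for the SDK's server-error type
            if attempt == max_retries:
                raise
            # Exponential backoff: 1s, 2s, 4s, ... between attempts.
            time.sleep(base_delay * (2 ** attempt))

# Demo: a fake stream that fails twice mid-stream, then succeeds.
calls = {"n": 0}

def flaky_stream():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("The server had an error while processing "
                           "your request. (Error occurred while streaming.)")
    yield from ["Hello", ", ", "world"]

chunks = stream_with_retry(flaky_stream, base_delay=0.01)
```

This won’t fix the underlying slowness, but it turns the occasional “openai stream error” into a short delay instead of a failed request.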

Adding onto this, a LOT of people have been complaining about this same error since Friday I believe.


OpenAI just released this announcement:

Update:
[Screenshot: OpenAI status update, 2023-10-16, 12:12 PM]