API calls taking 2+ minutes vs 15 seconds previously?

We’ve recently run into issues with response times from GPT-4 and gpt-3.5-turbo. We initially had response times of around 10 to 15 seconds with gpt-3.5-turbo. Then the responses suddenly went up to 1 minute, then to 2, and now they’re consistently at 2 minutes. Nothing has changed on our end in how we call the API.
We have switched over to GPT-4, but are experiencing similar times.
Did something change with how the API is being prioritized and processed?
It’s killing our ability to create good reports for our users, and we are now evaluating other AI tools with faster response times.
Any help would be appreciated.
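
To illustrate what we mean by “response time,” here is a minimal timing sketch using the official openai Python client. The model name and prompt are placeholders, not our actual ones, and this is simplified from what we run in production:

```python
import time
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

start = time.perf_counter()
response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # placeholder; we see similar times with gpt-4
    messages=[{"role": "user", "content": "Summarize this report section..."}],
)
elapsed = time.perf_counter() - start

# Previously this printed ~10-15 s; now it is consistently around 120 s.
print(f"Completion received in {elapsed:.1f} s")
print(response.choices[0].message.content)
```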


Yes, OpenAI is toying with accounts using gpt-3.5-turbo and messing with other models and endpoint code to see how much they can annoy API developers.


Got the same thing since they introduced GPT vision. On a personal account the response time is 15 to 30 seconds; on a business account it’s 90 seconds or more.
