GPT 3.5 Turbo is very slow lately

Is anyone else experiencing slow GPT-3.5 Turbo API responses?
Our scripts have the timeout set to 120 seconds, and the API often fails to return even a few hundred tokens in that interval.
I noticed that this slowness started a few days ago.
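For anyone who wants to check this on their own account: here is a minimal way to time a request and compute tokens per second (a Python sketch using only the standard library; the helper names `timed_completion` and `tokens_per_second` are mine, and it assumes an `OPENAI_API_KEY` environment variable):

```python
import json
import os
import time
import urllib.request

API_URL = "https://api.openai.com/v1/chat/completions"

def tokens_per_second(completion_tokens: int, elapsed: float) -> float:
    """Throughput metric; guards against a zero elapsed time."""
    return completion_tokens / elapsed if elapsed > 0 else 0.0

def timed_completion(prompt: str, timeout: float = 120.0) -> tuple:
    """Send one gpt-3.5-turbo request; return (elapsed seconds, completion tokens)."""
    body = json.dumps({
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    req = urllib.request.Request(
        API_URL,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}",
        },
    )
    start = time.monotonic()
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        data = json.load(resp)
    elapsed = time.monotonic() - start
    return elapsed, data["usage"]["completion_tokens"]

if __name__ == "__main__":
    elapsed, n_tokens = timed_completion("Write a haiku about snails.")
    print(f"{n_tokens} tokens in {elapsed:.1f}s "
          f"-> {tokens_per_second(n_tokens, elapsed):.1f} tok/s")
```

Logging this number a few times a day makes it easy to tell whether the slowdown is on the API side or somewhere in your own stack.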


Same thing here. It was significantly faster before. Sometimes I get timeouts when the response is long. Is it a matter of server capacity?


Here is a screen recording of gpt-3.5-turbo and gpt-4 from the playground today:

gpt-3.5-turbo and gpt-4 are about the same speed. I have a site in production that is suffering badly from the slow gpt-3.5-turbo API.



I am starting to believe that OpenAI wants to get rid of API users and focus more on the $20/month ChatGPT subscription users. I say this because, while GPT-3.5 is very fast in its dedicated web app, it is incredibly slow over the API.



Yes, same here. It got very slow starting 2 days ago.


Microsoft Azure OpenAI is now about 3 times faster than OpenAI’s servers on gpt-3.5-turbo: a job that takes 12 seconds on Azure takes 35 seconds at OpenAI.


I’ve seen the same thing on my side, whether it’s used through ChatGPT or through the API.

I thought it was just me!

Badly bombed an important demo today. I was mad at my choice of a lower-end EC2 instance, but it looks like that’s not the culprit (based on the log file timestamps).

You mean calling gpt-3.5-turbo via Azure, not posting to the OpenAI API? Is Microsoft offering their own API for querying GPT? Can you provide a relevant URL? Thanks.

EDIT: OMG, just saw this. Is there a waitlist or something?
EDIT 2: Is it still considered a custom connector, therefore forcing a paid Power Apps license in order to use the API?


I got my access 24 hours after I applied for it. But you’ll have to create an Azure account and follow the web pages on how to create a ChatGPT resource. It is quite limited at 300 RPM, but it is faster than OpenAI at the moment.
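A 300 RPM quota is easy to overrun from a busy site, so it’s worth throttling on the client side. A minimal sketch in Python (the `RateLimiter` class is my own helper, not part of any Azure SDK):

```python
import time

class RateLimiter:
    """Spread out calls so an N-requests-per-minute quota is never exceeded."""

    def __init__(self, rpm: int):
        self.min_interval = 60.0 / rpm   # seconds required between requests
        self.last_call = float("-inf")   # monotonic time of the previous request

    def wait(self) -> float:
        """Block until the next request is allowed; return seconds slept."""
        sleep_for = max(0.0, self.last_call + self.min_interval - time.monotonic())
        if sleep_for:
            time.sleep(sleep_for)
        self.last_call = time.monotonic()
        return sleep_for

# Usage: limiter = RateLimiter(300); call limiter.wait() before each API request.
```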




Thanks for getting back to me. Is it considered a custom connector?

No. It’s a standard Azure resource called OpenAI, and you’ll have to define a connector that you can use with the API.
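For anyone wondering what calling it looks like: Azure exposes each model as a “deployment” under your resource, and you authenticate with an `api-key` header instead of a Bearer token. A Python sketch (the resource and deployment names, the `AZURE_OPENAI_KEY` variable, and the helper names are placeholders of mine; the URL shape and header follow Azure’s REST docs):

```python
import json
import os
import urllib.request

def azure_chat_url(resource: str, deployment: str,
                   api_version: str = "2023-05-15") -> str:
    """Build the Azure OpenAI chat-completions endpoint for a deployment."""
    return (f"https://{resource}.openai.azure.com/openai/deployments/"
            f"{deployment}/chat/completions?api-version={api_version}")

def azure_chat(resource: str, deployment: str, prompt: str,
               timeout: float = 120.0) -> str:
    """Call an Azure OpenAI deployment; AZURE_OPENAI_KEY must be set."""
    body = json.dumps({"messages": [{"role": "user", "content": prompt}]}).encode()
    req = urllib.request.Request(
        azure_chat_url(resource, deployment),
        data=body,
        headers={"Content-Type": "application/json",
                 "api-key": os.environ["AZURE_OPENAI_KEY"]},
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        data = json.load(resp)
    return data["choices"][0]["message"]["content"]
```

Note there is no `"model"` field in the body: the deployment name in the URL already pins the model.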


So that means only the maker of the app needs an Azure subscription, and if the app is distributed to a number of users, none of them needs to pay for a Power Apps premium license? :heart_eyes:

Also, it keeps breaking after filling in only 25 characters.

Mm… you shouldn’t embed the API key in the app, because then people can steal it. You’ll have to set up a server that the app talks to, and the server then does the talking between the app and Azure OpenAI. If you’re talking about an Azure app / Power App or something like that, I’m not the right person to answer that question.
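The relay pattern described above can be sketched in a few lines of standard-library Python: the client posts a chat body to your server, the server attaches the secret key, and the key never leaves your machine. (The `UPSTREAM` URL, resource name, and `AZURE_OPENAI_KEY` variable below are hypothetical placeholders, not anything from the thread.)

```python
import os
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical Azure OpenAI deployment URL -- replace with your own.
UPSTREAM = ("https://example-resource.openai.azure.com/openai/deployments/"
            "gpt35/chat/completions?api-version=2023-05-15")

def build_upstream_request(client_body: bytes) -> urllib.request.Request:
    """Attach the secret key server-side; clients never see it."""
    return urllib.request.Request(
        UPSTREAM, data=client_body,
        headers={"Content-Type": "application/json",
                 "api-key": os.environ.get("AZURE_OPENAI_KEY", "")})

class RelayHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        try:
            with urllib.request.urlopen(build_upstream_request(body),
                                        timeout=120) as upstream:
                payload = upstream.read()
            self.send_response(200)
        except Exception:
            payload = b'{"error": "upstream request failed"}'
            self.send_response(502)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(payload)

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), RelayHandler).serve_forever()
```

A real deployment would also add its own authentication and rate limiting on this relay, otherwise anyone who finds the endpoint can burn your quota.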




That’s like saying Amazon doesn’t want third-party sellers. OpenAI’s bread and butter will always be the API. It’s probably slow because everyone is getting exponentially more customers. This is all uncharted waters, and much of it is still in alpha or beta. You also have to consider whether you’re using serverless platforms: cold starts can take forever, and timeouts become more likely.
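On the timeout point: rather than failing on the first slow response, it helps to wrap the call in retries with exponential backoff. A generic Python sketch (the `with_retries` helper is my own, not from any SDK):

```python
import random
import time

def with_retries(call, max_attempts: int = 4, base_delay: float = 1.0):
    """Retry a flaky zero-argument call with exponential backoff and jitter."""
    for attempt in range(max_attempts):
        try:
            return call()
        except TimeoutError:
            if attempt == max_attempts - 1:
                raise  # out of attempts: surface the error
            # delays of roughly base_delay * 1, 2, 4, ... with a little jitter
            time.sleep(base_delay * 2 ** attempt + random.random() * base_delay)

# Usage: result = with_retries(lambda: timed_api_call(prompt))
```

The jitter matters on serverless platforms: if many cold-started instances retry in lockstep, they just hammer the API at the same instant again.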

Right now it is in a closed beta stage, nothing in production.

I was wondering myself about distributing the app to users if/when the time comes.

Will need to give it some thought later on.

Thanks for the tip!!

It says N/A for ChatGPT (gpt-3.5-turbo) in Microsoft Azure. What does that mean?