Huge Latency since 03/22/2023 on Make

Hello, for the past two days, the OpenAI module has been getting stuck for several minutes before returning a single result.

Is anyone else experiencing the same?


Yeah, I’ve been having the same issue; the latency has increased massively. My suspicion is that OpenAI has grown so rapidly, and so many large enterprise organizations are integrating its models into their workflows, that OpenAI has prioritized enterprise API calls, leaving the rest of us with fewer server resources and, as a result, higher latency.

Keep in mind this is 100% speculation.


I’m going to keep in mind that OpenAI is not reliable.

Please keep in mind that ChatGPT went from dozens of users to 100+ million in a matter of days… the fastest growth of any app in history.

That they’re keeping the servers up at all is tremendous in my eyes.

Be patient. I’m sure the reliability will improve as time goes on.

I’m pretty sure gpt-3.5-turbo is actually a smaller model that still gives good results, which is why it returns so fast… and it’s cheap! GPT-4, on the other hand, has a huge context window and requires a lot of resources to run, hence the latency problems with 100+ million users.

Where did you get those statistics from? Moreover, a company like OpenAI should have anticipated this and avoided it.

ChatGPT does not work for me: a lot of timeouts and connection problems. You literally cannot rely on OpenAI, and you cannot even contact them. Even if you send them an email, you get no answer after a month. Basically, they don’t care about you.

The API’s slow response in recent days is problematic. My work relies on the service’s timeliness and correctness. I hope OpenAI solves this issue quickly.
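
In the meantime, a client-side timeout plus retries can at least stop a workflow from hanging on a slow call. Below is a minimal Python sketch that calls the chat completions endpoint directly over HTTP; the endpoint and payload shape follow the public API, but the timeout, attempt count, and backoff values are my own illustrative choices, not official guidance from OpenAI or Make.

```python
import os
import time

import requests

API_URL = "https://api.openai.com/v1/chat/completions"
API_KEY = os.environ["OPENAI_API_KEY"]

def chat_with_retry(messages, timeout=30, max_attempts=3):
    """Call the chat completions API, retrying on timeouts and 5xx errors.

    timeout and max_attempts are illustrative values, not an official
    recommendation from OpenAI or Make.
    """
    headers = {"Authorization": f"Bearer {API_KEY}"}
    payload = {"model": "gpt-3.5-turbo", "messages": messages}
    for attempt in range(1, max_attempts + 1):
        try:
            resp = requests.post(
                API_URL, headers=headers, json=payload, timeout=timeout
            )
            if resp.status_code < 500:
                resp.raise_for_status()  # surface 4xx errors immediately
                return resp.json()["choices"][0]["message"]["content"]
        except requests.Timeout:
            pass  # slow response: fall through to the backoff and retry
        time.sleep(2 ** attempt)  # exponential backoff: 2s, 4s, 8s...
    raise RuntimeError(f"no response after {max_attempts} attempts")

# Example:
# print(chat_with_retry([{"role": "user", "content": "Hello"}]))
```

Capping the timeout means a stuck request fails fast and gets retried instead of blocking the whole scenario for minutes.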

Damn…

Is there someone from the OpenAI team here who can let us know what’s going on?

My mistake… it was 1 million to 100 million users in two months…

Well, I was thinking that I shouldn’t trust any source other than the official one from OpenAI, but then I remembered that I can’t even trust that one. I can speculate too: even if it is the fastest-growing app ever, I’m sure they anticipated this could happen, given the hype they generated.