🐢 GPT-4 extremely slow on the GPT-4 API and ChatGPT

Hi, my GPT-4 API calls have been extremely slow starting today. I am experiencing the same issue with ChatGPT (GPT-4). It seems to be linked to my account, because when I test the speed with another account, GPT-4 responds normally. I haven’t reached my maximum quota for this month. Could it be related to a quota issue? Are there any other possible reasons?

Thanks for your help

Same here. Every request times out after 5 minutes. I need to launch 5 requests to get 1 response, and I pay for all 5. Really amazing! What is your plan to solve this, OpenAI?

2 Likes

Yup, this is happening to me as well. Also seems to be specifically connected to my account.

1 Like

They're having server issues. I don’t work for OpenAI, but this usually happens when they release new features. Plugins probably got everyone excited and jumping on, and plugins use GPT-4 in the background.

I hope they solve it soon.

1 Like

Yes, it was happening to me too. So I changed my backend to stream the response via SSE (server-sent events) so that my users at least know something is going on.
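
In case it helps anyone, here is a minimal sketch of what I mean (assuming a Flask backend and the v1 `openai` Python client; the route name and prompt are just placeholders, adapt to your own stack):

```python
# Relay the model's output as server-sent events so the user sees
# progress immediately instead of staring at a spinner for minutes.
from flask import Flask, Response
from openai import OpenAI

app = Flask(__name__)
client = OpenAI()  # reads OPENAI_API_KEY from the environment

@app.route("/chat")
def chat():
    def event_stream():
        # stream=True returns the completion incrementally instead of
        # waiting for the full (possibly multi-minute) response
        stream = client.chat.completions.create(
            model="gpt-4",
            messages=[{"role": "user", "content": "Hello"}],
            stream=True,
        )
        for chunk in stream:
            delta = chunk.choices[0].delta.content
            if delta:
                # SSE format: a "data:" line followed by a blank line
                yield f"data: {delta}\n\n"
        yield "data: [DONE]\n\n"

    return Response(event_stream(), mimetype="text/event-stream")
```

On the frontend, an `EventSource` (or a fetch reader) can append each chunk as it arrives.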

1 Like

Overall average today for GPT-4 is 30 s, with a few spikes around 6 minutes:

3 Likes

Impressive! May I ask where you obtained this data from?

1 Like

This is from the monitor in AWS hooked to the Lambda function that makes calls to the GPT-4 API.

So this is my own personal view of performance, as seen from my API calls.
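
For anyone curious, the setup is roughly like the sketch below (my own simplified version, assuming the v1 `openai` Python client is packaged with the function; the prompt handling is just illustrative). Timing each call and logging it means the latency shows up alongside the Lambda's CloudWatch duration metrics:

```python
# Lambda handler that times each GPT-4 call so latency can be charted
# from the CloudWatch logs and function duration.
import json
import time

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def lambda_handler(event, context):
    start = time.monotonic()
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": event.get("prompt", "Hello")}],
    )
    elapsed = time.monotonic() - start
    # A structured log line makes it easy to chart latency over time
    print(json.dumps({"gpt4_latency_seconds": round(elapsed, 2)}))
    return {
        "statusCode": 200,
        "body": response.choices[0].message.content,
    }
```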

1 Like

Here’s our API performance for GPT-3.5-turbo over the last month:

The data are somewhat noisy, but there is clearly an upward trend. The type of inference hasn’t changed and the size of each request hasn’t changed.

2 Likes

I wonder what the solution is, especially if you are trying to build a business around it.

1 Like

Well, it’s not clear that there is a solution yet.
I’m hoping the OpenAI folks will pull through.
I’d be OK paying, say, 10x more for a 4x faster product with some kind of latency guarantee.

I’m also looking at other models, both self-hosted models and competing providers, but it’s somewhat slim pickings for now … especially until Google actually lets people use that PaLM model they’ve “released” twice already.

Clearly, there’s demand for what OpenAI are doing, and the main challenge is meeting that demand.

1 Like

Greetings! I use GPT-4 along with DALL·E 3 for creating artistic images. Lately, I’ve noticed that it’s awfully slow. I thought it was due to the long chat I’m having while creating my next project, but I see I’m not the only one this happens to. Once I enter the prompt, I have to log out and log back in to see the response. In addition, at the beginning it offered 2 images among the results, but it has been limited to one per request in recent weeks. LimeWire and Bing also support DALL·E 3, but it is not the same as using it through GPT-4.

I’m out, OpenAI! You charge $20 for a *** slow service. It’s not even a service anymore, since almost all my queries fail.

2 Likes

March 2024 here: it has become slower recently, often freezing and accepting no input at all, or not completing answers and then freezing.

Based on what I’ve seen at other tech firms that quickly gain customers, OpenAI isn’t backing the current version’s customer issues with enough staff, money, and expertise. It’s not worth $20 when there are other options on the market.

2 Likes

Probably because lots of people use it. That’s why.

I’m experiencing the same thing now. GPT-3.5 is quite fast and stable, but with GPT-4, if the response takes less than 30 s it’s OK; any longer than that and nothing happens. :frowning:
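
If it helps, here is a minimal sketch of working around the hangs on the API side (assuming the v1 `openai` Python client; the timeout and retry values are just examples):

```python
# Cap how long a single GPT-4 request may hang and retry automatically
# instead of waiting on a response that never arrives.
from openai import OpenAI

client = OpenAI(
    timeout=60.0,    # give up on any single request after 60 seconds
    max_retries=3,   # failed/timed-out requests are retried with backoff
)

response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)
```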

What I find frustrating is the complete lack of communication from their side.

I’m surprised there aren’t more people commenting on this issue. I have seen a significant increase in response times as well as an increase in failures to generate responses across both GPT-4 and 3.5 in the past two weeks. I use it as a great resource to complement my academic learning, since I can seek clarification, additional detail, or analogies on topics and information. I’m hopeful that it won’t be long before I am able to converse with AI verbally rather than type. I came here for answers on the current system lag and failures, but nothing seems to have been identified in this thread. I’ll keep an eye/ear out, I suppose. But yes, it is bothersome to see this issue when we are paying for the most efficient version.

1 Like

Also having this issue, and have been for a while. ChatGPT 4 responses are taking upwards of 5 minutes to complete, if they even complete without erroring. I’m probably going to have to drop my subscription next month if they can’t figure something out. ChatGPT 4 is no longer a worthwhile product if it can’t even manage to give a full response. I reached out to support last week and have gotten no response, which I’m disappointed by, but not surprised.

1 Like

When ChatGPT is slow due to maintenance or high traffic, I use ChatGPT at MiniToolAI.com