šŸ¢ : GPT4 extremely slow on GPT4 API and ChatGPT

Hi, my GPT-4 API calls have been extremely slow since today. I am experiencing the same issue with ChatGPT (GPT-4). It seems to be linked to my account because when I test the speed using another account, GPT-4 responds normally. I haven't reached the maximum quota for this month. Could it be related to a quota issue? Are there any other possible reasons?

Thanks for your help

Same here. It times out every time after 5 minutes. I need to launch 5 requests to get 1 response, and I pay for all 5. Really amazing! What is your plan to solve this, OpenAI?
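In case it helps others hitting the same wall, here is a minimal sketch (not the poster's code, and assuming the openai Python client v1+) of setting a client-side timeout and bounded retries, so each failed attempt at least fails fast instead of hanging for five minutes:

```python
# Rough sketch: cap each attempt with a client-side timeout and let the
# client retry transient failures a bounded number of times.
# Assumes the openai Python client v1+; the model and prompt are placeholders.
from openai import OpenAI

client = OpenAI(
    timeout=120,     # seconds before a single request is abandoned
    max_retries=2,   # automatic retries on transient errors
)

response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```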

2 Likes

Yup, this is happening to me as well. Also seems to be specifically connected to my account.

1 Like

They're having server issues. I don't work for OpenAI, but this usually happens when they release new features. Plugins probably got everyone excited and jumping on, and plugins use GPT-4 in the background.

I hope they solve it soon.

1 Like

Yes, it was happening to me too, so I changed my backend responses to SSE (server-sent events) so that my users at least know that something is going on.
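For anyone wanting to do the same, here is a minimal sketch of streaming GPT-4 output to the client as server-sent events; it assumes a FastAPI backend and the openai Python client v1+, which may differ from the poster's actual stack:

```python
# Minimal sketch: stream GPT-4 output as Server-Sent Events so users see
# partial text instead of a silent multi-minute wait.
# Assumes FastAPI and the openai Python client v1+; names are illustrative.
from fastapi import FastAPI
from fastapi.responses import StreamingResponse
from openai import OpenAI

app = FastAPI()
client = OpenAI()  # reads OPENAI_API_KEY from the environment


@app.get("/chat")
def chat(prompt: str):
    def event_stream():
        stream = client.chat.completions.create(
            model="gpt-4",
            messages=[{"role": "user", "content": prompt}],
            stream=True,  # tokens arrive as they are generated
        )
        for chunk in stream:
            if chunk.choices and chunk.choices[0].delta.content:
                # SSE framing: each event is "data: ...\n\n"
                yield f"data: {chunk.choices[0].delta.content}\n\n"
        yield "data: [DONE]\n\n"

    return StreamingResponse(event_stream(), media_type="text/event-stream")
```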

1 Like

Overall average today for GPT-4 is 30 s, with a few spikes around 6 minutes:

3 Likes

Impressive! May I ask where you obtained this data from?

1 Like

This is from the AWS monitoring hooked up to the Lambda function that makes the calls to the GPT-4 API.

So it's my own personal view of performance from my API calls.
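For reference, a rough sketch of how such per-call latency can be captured from a Lambda function and pushed to CloudWatch as a custom metric; the namespace and metric names below are illustrative, not the poster's actual setup:

```python
# Rough sketch: time each GPT-4 call inside a Lambda handler and record the
# duration as a CloudWatch custom metric. Assumes boto3 and the openai
# Python client v1+; namespace/metric names are hypothetical.
import time
import boto3
from openai import OpenAI

client = OpenAI()
cloudwatch = boto3.client("cloudwatch")


def handler(event, context):
    start = time.monotonic()
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": event["prompt"]}],
    )
    elapsed = time.monotonic() - start

    # Publish the call duration so latency can be graphed over time.
    cloudwatch.put_metric_data(
        Namespace="OpenAI/Latency",
        MetricData=[{
            "MetricName": "Gpt4ResponseSeconds",
            "Value": elapsed,
            "Unit": "Seconds",
        }],
    )
    return {
        "latency_seconds": elapsed,
        "reply": response.choices[0].message.content,
    }
```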

1 Like

Here's our API performance for GPT-3.5-turbo over the last month:

The data are somewhat noisy, but there is a clear upward trend. The type of inference hasn't changed, and the size of each request hasn't changed.

2 Likes

I wonder what the solution is, especially if you are trying to build a business around it.

1 Like

Well, it's not clear that there is a solution yet.
I'm hoping the OpenAI folks will pull through.
I'd be OK paying, say, 10x more for a 4x faster product with some kind of latency guarantee.

I'm also looking at other models, both self-hosted and from competing providers, but it's somewhat slim pickings for now… especially until Google actually lets people use that PaLM model they have "released" twice already.

Clearly, there's demand for what OpenAI is doing, and the main challenge is keeping up with that demand.

1 Like

Greetings! I use GPT-4 along with DALL·E 3 to create artistic images. Lately I've noticed that it's awfully slow. I thought it was due to the long chat I'm having while creating my next project, but I see I'm not the only one this happens to. Once I enter a prompt, I have to log out and log back in to see the response. In addition, at the beginning it offered 2 images among the results, but it has been limited to one per request in recent weeks. LimeWire and Bing also support DALL·E 3, but it's not the same as using it through GPT-4.

I'm out, OpenAI! You ask $20 for a *** slow service. It's not even a service anymore, since almost all queries fail.

2 Likes

March 2024 here – it has become slower recently, often freezing and allowing no input at all, or not completing answers and then freezing.

Based on what I've seen at other tech firms that quickly gain customers, OpenAI isn't supporting the current version's customer issues with enough staff, money, and expertise. Not worth $20 when there are other options on the market.

2 Likes

Probably because lots of people use it. That's why.

I'm experiencing the same now. GPT-3.5 is quite fast and stable, but with GPT-4, responses under 30 s are OK; beyond that, nothing happens :frowning:

What I find frustrating is the complete lack of communication from their side.

I'm surprised there aren't more people commenting on this issue. I have seen a significant increase in response times, as well as in failures to generate responses, across both GPT-4 and 3.5 in the past two weeks. I use it as a great resource to complement my academic learning, since I can ask for clarification, additional detail, or analogies on topics and information. I'm hopeful it won't be long before I can converse with AI verbally rather than by typing. I came here for answers on the current system lag and failures, but nothing seems to have been identified in this thread. I'll keep an eye/ear out, I suppose. But yes, it is bothersome to see this issue when we are paying for the most efficient version.

1 Like

Also having this issue, and have been for a while. ChatGPT-4 responses are taking upwards of 5 minutes to complete, if they even complete without erroring. I will probably have to drop my subscription next month if they can't figure something out; ChatGPT-4 is no longer a worthwhile product if it can't even manage to give a full response. I reached out to support last week and have gotten no response, which I'm disappointed by, but not surprised.

1 Like

When ChatGPT is slow due to maintenance or high traffic, I use ChatGPT at MiniToolAI . Com