Why is the API faster for free trial accounts than for paid accounts?

On a website that tracks OpenAI API response times, I saw that free trial accounts are faster than paid accounts. I tried it myself and found that it is indeed true. Why is this? Isn’t it unfair to paid users? The speed difference is significant, with free trial accounts being on average 2-3 times faster than paid ones.

2 Likes

This is not true; there is no inherent speed difference between free trial and paid accounts. If the website shows different speeds, it’s likely because its benchmarking code is incorrect.

3 Likes

To back this up, here is my own test using chatbot-ui with GPT-3.5-turbo. On the left is my own API key, from a paid account with a ChatGPT Plus subscription. On the right is my friend’s API key, which has no credit card attached and $5 of free credit.

[Screenshot: CleanShot 2023-05-06 at 00.17.37]

2 Likes

There is no difference in speed on the API side.
Your two accounts are likely running in different datacenters, and resource constraints affect different users at different times.
Your network latency also makes a huge difference in the speed of the data stream.

I found the same problem. I had two accounts, one free and the other paid. Calling OpenAI from both accounts on the same machine, I clearly found that the free account was much faster than the paid account. After I attached a bank card to my free account, that account also slowed down.

I hope the OpenAI staff can explain.

4 Likes

I have run tests before and found that a free account is indeed much faster than a paid account. I tested two accounts on the same machine at the same time, and it’s like this every time.

2 Likes

I switched over to paid yesterday and noticed my Siri shortcut takes longer to generate the GPT-3.5 response than when I was using my $18 of credits.

2 Likes

Why say anything without any knowledge? Have you tested this? Because it’s clearly a thing. It might not be intentional or documented anywhere, which is why bringing these things up is good, so they can fix the problem if it is a problem.

Judging from this result, the free account’s response time really is shorter. Can someone please explain?

2 Likes

I wanted to say “There’s no difference”

… but since people are experiencing it, I’ll just be quiet and follow the thread…

1 Like

We have the exact same issue. Our GPT-3.5 Turbo API calls were taking a VERY long time, so on the exact same machine with the same code we simply replaced our API key with a free trial key. The response time is easily 4 to 5 times faster. This is a serious problem; can anyone at OpenAI give an update? We’re getting ready to pause development of our product because of it, and I know there are similar discussions going on at other companies having the same problem. Many thanks for any feedback.

3 Likes

I am also facing the same issue. Thanks for bringing this up.

2 Likes

We are facing the same issue. Is there any update?

1 Like

Hello,
We are continuing to experience the same issue. Our benchmark tests were conducted using the same codebase, with the only variable being the type of API key used. One key was for a free trial account, while the other was for a paid account.
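
For reference, the comparison boils down to something like the sketch below. This is a minimal illustration of the A/B test rather than our actual codebase: it sends an identical request to the standard /v1/chat/completions endpoint and only swaps the key. The environment-variable names and the prompt are placeholders.

```python
# Minimal sketch of the A/B test: one identical request per key, timed.
# Environment-variable names and the prompt are placeholders.
import os
import time

import requests

URL = "https://api.openai.com/v1/chat/completions"
BODY = {
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Summarize the plot of Hamlet in three sentences."}],
}

def timed_call(api_key: str) -> float:
    """Send one chat completion and return the elapsed wall-clock time in seconds."""
    start = time.perf_counter()
    resp = requests.post(
        URL,
        headers={"Authorization": f"Bearer {api_key}"},
        json=BODY,
        timeout=120,
    )
    resp.raise_for_status()  # surface auth/rate-limit errors instead of timing them
    return time.perf_counter() - start

for label, env_var in [("free trial key", "OPENAI_KEY_FREE"), ("paid key", "OPENAI_KEY_PAID")]:
    print(f"{label}: {timed_call(os.environ[env_var]):.2f}s")
```

A single non-streaming request keeps the comparison simple; with streaming enabled, time-to-first-token and total time would need to be measured separately.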

1 Like

We are facing the same issue after upgrading to a paid account. A prompt that usually takes 9 seconds is now taking 29 seconds. There are other prompts that take even longer and time out the API gateway on our end.

We tried the same prompt with two different keys - one paid and one free. For sure, the paid account is consistently slower than the free account.

It’s highly disappointing that the paid account provides inferior, sub-standard service compared to the free tier. We evaluated using the free tier, upgraded to a paid account, and are now faced with show-stopper issues.

So far, there’s been absolutely no response from the OpenAI team, and we are stranded.
@OpenAiSupport @logankilpatrick please help us move forward.

1 Like

I am currently in the same situation. I haven’t been able to find the reason why the free account is faster. :smiling_face_with_tear:

1 Like

@logankilpatrick
There are a lot of posts here, and in other threads, that point to your claim being false. Benchmark timings like the Unix time utility or curl response times have been around for a very long time and have been used by a large number of people. Exactly what are you claiming here? I’m simply running a curl command over and over and aggregating response times to get a min, max, mean, and stdDev. The difference between response times from the dev API key and the paid-for key is huge.
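
For what it’s worth, the loop is roughly equivalent to the Python sketch below (repeated identical requests against /v1/chat/completions, then min/max/mean/stdDev over the wall-clock times). The prompt, model, and request count are placeholders, not my actual commands.

```python
# Rough equivalent of timing a curl request in a loop and aggregating
# min / max / mean / stdDev. Run it once per API key and compare.
# Prompt, model, and N are placeholders.
import os
import statistics
import time

import requests

URL = "https://api.openai.com/v1/chat/completions"
BODY = {
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Say hello in five words."}],
}
N = 10  # number of timed requests

timings = []
for _ in range(N):
    start = time.perf_counter()
    resp = requests.post(
        URL,
        headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
        json=BODY,
        timeout=120,
    )
    resp.raise_for_status()
    timings.append(time.perf_counter() - start)

print(f"min={min(timings):.2f}s  max={max(timings):.2f}s  "
      f"mean={statistics.mean(timings):.2f}s  stdDev={statistics.stdev(timings):.2f}s")
```

Swapping the key in OPENAI_API_KEY is the only change between runs.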

What benchmarking code do you recommend?

The current theory is that customers get allocated to some kind of clustered resource when they make a paying account, and those clustered resources sometimes suffer from noisy-neighbor syndrome and sometimes don’t.
This may vary both per cluster and over time.
The chat API is allocated to other clustered resources, which have different levels of load.

It may be that if you create a new paid account and run timings on that account, you will see different numbers, simply because it gets allocated to another cluster.

At least, that’s the theory.

@logankilpatrick
Hi, I have the same problem: my paid account is 4x slower than the free API. What can you (OpenAI) do to speed things up? I don’t want to create four free accounts and keep switching API keys in my code, never reaching “tier 1”. I want to pay for your services.

I believe Logan has turned off notifications. This is not a one-stop “ask an employee” center.

It is logical to see why an “advertisement” (the free trial) might be more performant. The question is answered by the tier system and the reports of its effects on those not paying enough to reach tier 2+.

Some money-grubber (hopefully one since ousted) decided that if you were not making OpenAI $250+ per year upfront, you were to be downgraded to punishing levels of poor service.