API became slower after adding a credit card

I was trying to increase the rate limit from 3 requests per minute for free users by adding a credit card and becoming a Pay As You Go user. However, the API responses became significantly slower: from about 20 seconds before adding the credit card to about 60 seconds or more afterwards. That is incredibly strange. To confirm the issue, I added a credit card to another account and compared the speed; same result. Has anyone else encountered this problem?
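Roughly how the comparison can be reproduced, as a minimal sketch (the two keys and the prompt below are placeholders, not real values):

```python
import time
import requests

# Placeholder keys: one from a free account, one from a Pay As You Go account
KEYS = {"free": "sk-free-...", "paid": "sk-paid-..."}

def time_request(api_key: str, prompt: str) -> float:
    """Send one chat completion request and return the elapsed seconds."""
    start = time.monotonic()
    resp = requests.post(
        "https://api.openai.com/v1/chat/completions",
        headers={"Authorization": f"Bearer {api_key}"},
        json={
            "model": "gpt-3.5-turbo",
            "messages": [{"role": "user", "content": prompt}],
        },
        timeout=120,
    )
    resp.raise_for_status()
    return time.monotonic() - start

# Example prompt only; the exact text does not matter for the comparison
prompt = "Write an article of no less than 400 words about renewable energy."
for label, key in KEYS.items():
    print(label, round(time_request(key, prompt), 1), "s")
```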

3 Likes

Interesting. There could be an extra step for paid accounts that causes the delay. Maybe the dev team could answer. 🙂

1 Like

I got the same issue!
I compared my free API key and my billing API key;
the latter is much slower than the former.
With the same input, the free key takes 2 s while the paid key takes 5 s.

2 Likes

I did not notice that change. Maybe there are different models? Or normal saturation?

1 Like

I tried this several times, and every time the billing API key was slower than the free one.
I used the gpt-3.5-turbo model.

1 Like

Same model and input. It is gpt-3.5-turbo on the ‘chat/completions’ API. My requirement is to generate articles of about 400 words based on the input. My prompts have a fixed format: “TITLE, no less than 400 words.”

I also tried this at different times, but the situation remains the same. It is ridiculous that this happens after becoming a paid account.

Yes, I found this anomaly on my side as well.
OpenAI’s API response generation speed, from fastest to slowest, is: free account (no Plus, non-paid) > paid account (no Plus) > paid account (with Plus).

We are also among the many people complaining about the same situation, but unfortunately OpenAI has not provided any solution.

The same problem here. The free API is much faster than the paid one. It seems the paid API from Azure OpenAI Service responds fastest now.

We are facing the same issue after upgrading to a paid account. A prompt that usually takes 9 seconds is taking 29 seconds now. There are other prompts that take even longer and time out the API gateway on our end.

It’s highly disappointing that the paid account provides inferior, sub-standard quality compared to the free tier. We evaluated using the free tier, upgraded to a paid account, and are now faced with show-stopper issues.

So far, there’s absolutely no response from the OpenAI team and we are stranded.
@OpenAiSupport please help us move forward.

Our company has decided to use the paid services of OpenAI. During the tests on the free account everything worked quickly; after connecting the payment card to the account, it slowed down significantly.
A text of 1,500 characters was generated in 25 seconds on the free account and 60 seconds on the paid account.
It is a real scandal to treat customers who decide to use paid services this way: encourage quick adoption during testing and then impose restrictions. We are very disappointed with OpenAI’s attitude.

If your type of application allows it, you may use parallel processing.

E.g. splitting the text with an outline and creating three texts of 500 characters in parallel might be faster than 1,500 in one, as in the sketch below.
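A rough sketch of that idea with a thread pool; the API key, outline, and section prompts are made up for illustration:

```python
import concurrent.futures
import requests

API_KEY = "sk-..."  # placeholder

def generate(section_prompt: str) -> str:
    """Request one ~500-character section from chat/completions."""
    resp = requests.post(
        "https://api.openai.com/v1/chat/completions",
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={
            "model": "gpt-3.5-turbo",
            "messages": [{"role": "user", "content": section_prompt}],
        },
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

# Hypothetical outline: three sections requested in parallel instead of
# one 1,500-character generation in a single call.
sections = [
    "Write the introduction (about 500 characters) of an article on X.",
    "Write the middle part (about 500 characters) of an article on X.",
    "Write the conclusion (about 500 characters) of an article on X.",
]

with concurrent.futures.ThreadPoolExecutor(max_workers=3) as pool:
    parts = list(pool.map(generate, sections))

article = "\n\n".join(parts)
print(article)
```

The total wall-clock time is then roughly that of the slowest single section rather than the sum of all three.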

Hope that helps.