I was trying to increase the rate limit from 3 requests per minute for free users by adding a credit card to become a Pay As You Go user. However, the API responses became significantly slower: from about 20 seconds before adding the credit card to 60 seconds or more afterwards, which is really strange. To confirm the issue, I added a credit card to another account and compared the speed: same result. Has anyone else encountered this problem?
Interesting. There could be an additional processing step for paid accounts that causes the delay. Maybe the dev team could answer.
I got the same issue!
I compared my free API key and my billing API key; the latter is much slower than the former.
With the same input, the free key takes 2 s while the paid key takes 5 s.
I have not noticed that change. Maybe they are routed to different models? Or is it normal saturation?
I tried this several different times, and every run shows the billing API key is slower than the free one.
I used the gpt-3.5-turbo model.
Same model and input: gpt-3.5-turbo on the ‘chat/completions’ API. My use case is generating articles of about 400 words from an input. My prompts have a fixed format: "TITLE, no less than 400 words."
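For anyone who wants to reproduce this comparison, here is a minimal sketch of a timing harness for the chat/completions endpoint. It assumes you fill in your own free and paid keys and a prompt; the function names (`time_completion`, `median_latency`) are just illustrative, and taking the median over several runs helps average out ordinary load spikes.

```python
import json
import statistics
import time
import urllib.request

API_URL = "https://api.openai.com/v1/chat/completions"

def time_completion(api_key, prompt, model="gpt-3.5-turbo"):
    """Send one chat/completions request and return elapsed wall-clock seconds."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    req = urllib.request.Request(
        API_URL,
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    start = time.monotonic()
    with urllib.request.urlopen(req) as resp:
        resp.read()  # wait for the full (non-streamed) response
    return time.monotonic() - start

def median_latency(samples):
    """Median of several timed runs; less noisy than a single measurement."""
    return statistics.median(samples)

# Example usage (fill in real keys; several runs per key):
# free_times = [time_completion(FREE_KEY, PROMPT) for _ in range(5)]
# paid_times = [time_completion(PAID_KEY, PROMPT) for _ in range(5)]
# print(median_latency(free_times), median_latency(paid_times))
```

Comparing medians of interleaved runs (free, paid, free, paid, ...) would also rule out time-of-day load as the cause.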
I also tried this at different times of day, but the situation remains the same. This is ridiculous as a reward for becoming a paid account.
Yes, I found this anomaly on my side as well.
OpenAI’s API response generation speed, from fastest to slowest: free account (non-paid, no Plus) > paid account without Plus > paid account with Plus.
We are among the many people complaining about the same situation, but unfortunately OpenAI has not provided any solution.
Same problem here. The free API is much faster than the paid one. Right now the paid API from Azure OpenAI Service seems to respond fastest.