ChatGPT API responses are very slow; even short API calls of 200-400 tokens take 20-30 seconds. Is there any way to make the responses faster?
Hi @nandha
Yes, things are slow because of demand. I just checked for you by sending 300 words of lorem ipsum text to the chat completion API and got these results:
gpt-3.5-turbo-0301
Total Tokens: 826, Completion API Time: 16.17 seconds
Total Tokens: 866, Completion API Time: 14.434 seconds
Total Tokens: 1313, Completion API Time: 38.629 seconds
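If you want to reproduce timings like these yourself, a minimal sketch is below. The `timed_call` helper is a hypothetical name of my own; the commented-out usage assumes the `openai` Python package (0.27.x era, with `openai.ChatCompletion.create`) and is not executed here because it makes a network call.

```python
import time

def timed_call(fn, *args, **kwargs):
    """Run fn(*args, **kwargs) and return (result, elapsed_seconds)."""
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    return result, time.perf_counter() - start

# Example with the openai package (network call, so shown commented out):
# import openai
# response, elapsed = timed_call(
#     openai.ChatCompletion.create,
#     model="gpt-3.5-turbo-0301",
#     messages=[{"role": "user", "content": "<300 words of lorem ipsum>"}],
# )
# print(f"Total Tokens: {response['usage']['total_tokens']}, "
#       f"Completion API Time: {elapsed:.2f} seconds")
```

`time.perf_counter` is used rather than `time.time` because it is a monotonic clock intended for measuring short durations.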
I don't think there is much you can do at the moment, as the issue is with the performance of the turbo model(s). You could switch to another model, which I have tested to be faster than turbo these days.
HTH
Appendix: Example Completion
hi ruby_coder,
I'm using the API with the gpt-3.5-turbo model too, but the response is very slow. I'm calling the API from Python.
Yeah, it is slow for sure right now. I just tested again for you; the completion time was nearly 22 seconds:
My advice is to relax and do something less frustrating until the issue on the OpenAI infrastructure side improves, if you can.
HTH
Yes, maybe "turbo" is a bit of a "pretentious" adjective for this model.
I'm using curl with PHP in a 500-token-max environment, and the answers take around 30-50 seconds to come back.
Mine, too… I'm also having connection errors like this:
openai.error.APIConnectionError: Error communicating with OpenAI: ('Connection aborted.', ConnectionResetError(104, 'Connection reset by peer'))
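Connection resets like that one can sometimes be papered over with a retry-and-backoff wrapper. Below is a generic sketch, not anything OpenAI-specific: the `with_retries` helper and the delay values are my own assumptions to tune, and the commented-out usage assumes the `openai` package whose `APIConnectionError` appears in the traceback above.

```python
import time

def with_retries(fn, retries=3, base_delay=1.0, exceptions=(ConnectionError,)):
    """Call fn(), retrying with exponential backoff on the given exceptions."""
    for attempt in range(retries + 1):
        try:
            return fn()
        except exceptions:
            if attempt == retries:
                raise  # out of retries, propagate the last error
            time.sleep(base_delay * (2 ** attempt))

# Usage sketch (assumes the openai package, so shown commented out):
# import openai
# response = with_retries(
#     lambda: openai.ChatCompletion.create(
#         model="gpt-3.5-turbo",
#         messages=[{"role": "user", "content": "Hello"}],
#     ),
#     exceptions=(openai.error.APIConnectionError,),
# )
```

This won't make the API faster, but it turns intermittent `Connection reset by peer` failures into longer waits instead of hard errors.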
Same slowness here, plus occasional 502 Bad Gateway
responses after a long wait.
Sadly, the API is throttled for normal paying users, and at the moment we are also getting a lot of errors. It is not very usable in its current state, and we hope OpenAI will find a solution soon.