Sometimes API calls take more than 60 s, or even up to 100 s. Are there any specific reasons for this?
Nah, that is just normal… You might want to get Enterprise API access for priority, or go to Azure and deploy your own OpenAI model there for production-ready deployments.
The regular OpenAI API is mainly for small projects and model evaluation - at least that's how I understand it.
Ah, welcome to the developer community.
Thank you for your prompt response and warm welcome! We are now on Tier 4 API access. These high-latency cases occur at a very low rate, but we would still like to find the root cause. Thank you again for your suggestions; we may prioritize Azure OpenAI.
I am getting < 3 second responses there every time… I highly recommend going to Azure for professional applications.
Keep in mind that Azure has different usage policies - mostly less strict, as long as you commit to using it responsibly.
And if you have a startup, there are other options as well.
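For reference, here is a minimal sketch of calling your own deployment through Azure with the official openai Python SDK (v1.x); the endpoint, deployment name, and API version below are placeholders you would swap for the values from your Azure resource:

```python
import os
from openai import AzureOpenAI

# Minimal sketch: calling your own Azure OpenAI deployment.
# Endpoint, deployment name, and api_version are placeholders --
# replace them with the values from your Azure resource.
client = AzureOpenAI(
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    azure_endpoint="https://YOUR-RESOURCE.openai.azure.com",
    api_version="2024-02-01",
)

response = client.chat.completions.create(
    model="YOUR-DEPLOYMENT-NAME",  # the deployment name, not the base model name
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```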
Welcome to the community!
This seems way too long to just be normal delays, unless you got unlucky when the servers were at extremely high capacity.
Would you mind sharing a code snippet of where you send the request?
Another thing you can try is the Playground, to see whether responses are faster there - though if it is only a small percentage of requests, that might be a bit tedious to reproduce.
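In the meantime, here is a rough sketch of how you could log per-request latency and set a client-side timeout with the openai Python SDK, so the slow calls show up in your own logs (the model name and timeout value are just example choices):

```python
import time
from openai import OpenAI

# Example: fail fast instead of hanging well past a minute (60 s is an example value).
client = OpenAI(timeout=60)

def timed_completion(prompt: str):
    start = time.monotonic()
    response = client.chat.completions.create(
        model="gpt-4o",  # example model name
        messages=[{"role": "user", "content": prompt}],
    )
    elapsed = time.monotonic() - start
    print(f"request took {elapsed:.1f} s")  # log this to spot the slow outliers
    return response

timed_completion("Say hello")
```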
Cheers!
The post said "sometimes"… and that is normal on OpenAI servers… on average it is less…
ChatGPT is running slowly right now. I was working and wanted to generate code for a similar calculator-related tool, but its response time is consistently very slow. Is there a known issue? Please let me know, and also tell me how a tool like calculadora de horas could help me generate the code.