It was working well before Sunday, with no issues. Also, I just tried streaming, and the result is the same: 530 tokens in 68 seconds, without any parallel calls. There are definitely issues on the API endpoint.
For comparison, the same query in the ChatGPT free web app took 14 seconds.
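To put those figures side by side, here is a small sketch (my own illustration, not the poster's code; assuming the web-app reply was roughly the same length in tokens, which the post does not actually state):

```python
# Hypothetical helper to compare the throughput figures quoted above.
def tokens_per_second(tokens: int, seconds: float) -> float:
    """Average token throughput over a completed request."""
    return tokens / seconds

api_rate = tokens_per_second(530, 68)   # the API run reported in this post
web_rate = tokens_per_second(530, 14)   # web app, assuming a same-length reply
slowdown = web_rate / api_rate          # how many times slower the API was
```

Under that assumption, the API run comes out to roughly 7.8 tokens/second, and the web app about 4.9x faster.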
3 Likes
I never suggested that streaming causes tokens to generate faster.
I was strictly discussing preventing server timeouts from long wait times.
2 Likes
zenz
35
Not the first time we’ve seen system strain via API and ChatGPT on a Monday. My understanding is that OpenAI is losing money on a number of the offerings at current prices, so they are likely up/down scaling as needed, based on expected usage…
If I were any of the folks doing product demos on a Monday morning – might be smart to hold off until mid-week. 
2 Likes
_j
36
My own wild guess is that in the climate of GPU shortage for all the AI demand, Microsoft isn’t giving up its own Azure compute to OpenAI; instead, it is burning GPT-4 capacity on Bing searches locked to its own Edge browser, trying to make a browser and search engine nobody wants into a thing.
5 Likes
zenz
37
100% this seems like a compute resources issue one way or the other.
3 Likes
I’m getting slow responses, plus I’m getting GPT-3 responses instead of 3.5-turbo. Something happened today; I don’t know what. Pretty stupid!
2 Likes
Buy $50 of credit and become Tier 2, guys. I fixed the slow GPT-3.5 issue with that.
2 Likes
I think the idea that streaming helps you avoid the 30-second timeout on your server script is wrong. Although you start receiving data sooner, the timeout still hits after 30 seconds, and any data the API has not yet generated by then is lost.
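To make that concrete, here is a minimal sketch (the 530-tokens-in-68-seconds figure is from earlier in this thread; the uniform arrival rate is my own simplifying assumption):

```python
# Minimal sketch of the point above: with a hard 30 s server timeout,
# streaming only saves the tokens that arrive before the cutoff;
# everything generated after it is lost either way.

def tokens_received_before_timeout(total_tokens: int,
                                   total_seconds: float,
                                   timeout_s: float) -> int:
    """Assume tokens arrive at a uniform rate, so token i lands at
    (i + 1) * total_seconds / total_tokens seconds. Count how many
    arrive before the timeout fires."""
    def arrival(i: int) -> float:
        return (i + 1) * total_seconds / total_tokens
    return sum(1 for i in range(total_tokens) if arrival(i) <= timeout_s)

received = tokens_received_before_timeout(530, 68, 30)  # → 233 of 530 tokens
lost = 530 - received                                   # → 297 tokens lost
```

So streaming gets you a partial answer instead of nothing, but it does not prevent the truncation itself — only a faster model or a longer server timeout does.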
1 Like
Try it out and let us know how it goes.
1 Like
Yeah, I am getting a 503 error. And the ChatGPT free version has no issues here in Bangladesh.
2 Likes
I am having delays and errors as well… while basically losing money.
1 Like
We have been getting 503 errors in Goa, India since 10 am this morning.
1 Like
Facing the same issue for the last couple of days. Delays have gone from a few seconds up to a few minutes. Using the gpt-3.5-turbo model in India. Not something we expect from OpenAI.
1 Like
I’m having the same issues. The API has become so unreliable and slow that any tools based on it are useless at the moment.
1 Like
I solved the issue after reading one of the comments. I bought $50 of credits with our OpenAI company account, and the problem was solved; the queries became faster, more reliable, and served by a better ChatGPT model. Crazy!
1 Like
Buy $50 of credit; this is how I solved my issue.
1 Like
I bought $50 of credits with our OpenAI company account, and the problem was solved.
1 Like
I bought $50 of credits with our OpenAI company account, and no issues anymore.
1 Like
Please do not add to this topic unless you have tried the solution:
2 Likes
Further to my earlier posts yesterday, this seems to have come right and is back to normal for me for the past several hours. I did not add credit nor change any billing/plan. HTH.
1 Like