Has anyone had an update on GPT-4's slow API responses? Mine are getting worse, which is a real conundrum since GPT-4 is so much better at what I am asking it to do.
I am using the API with a key I paid for.
My response times are on the order of a minute for GPT-4. It depends on how much I give it to chew on.
Using it in completions mode/style (no chat context) I can get quicker responses; a simple query like "write a limerick about a dog" takes about thirty seconds.
Refining it in a chat context quickly starts taking more time and tokens.
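If you want to put numbers on this rather than eyeballing it, streaming the response lets you separate the time to the first token (queueing/startup on their end) from the total generation time. A minimal timing sketch, assuming the official `openai` Python package (v1.x) and an `OPENAI_API_KEY` in the environment; the model name and prompt are just placeholders:

```python
import time
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def timed_chat(model: str, messages: list) -> None:
    """Stream a chat completion and report time-to-first-token vs total time."""
    start = time.perf_counter()
    first_token_at = None
    chunks = 0

    stream = client.chat.completions.create(model=model, messages=messages, stream=True)
    for chunk in stream:
        # Each streamed chunk carries a small delta of the generated text.
        if chunk.choices and chunk.choices[0].delta.content:
            if first_token_at is None:
                first_token_at = time.perf_counter() - start
            chunks += 1

    total = time.perf_counter() - start
    if first_token_at is None:
        print(f"{model}: no content received after {total:.1f}s")
    else:
        print(f"{model}: first token after {first_token_at:.1f}s, "
              f"{chunks} chunks in {total:.1f}s total")


# Bare prompt, no prior chat history -- roughly the "completions-style" case above.
timed_chat("gpt-4", [{"role": "user", "content": "Write a limerick about a dog."}])
```

The point of splitting it this way is that a long time to first token points at queueing or load on OpenAI's side, while a long generation tail is just the model producing a lot of tokens, which is what longer chat contexts and longer answers blow up.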
I am happy with it, once I realised it was not timing out and learnt to wait.
11th of May and ChatGPT-4 has become almost unusable due to poor performance. I’m not sure I can justify $20 a month for a substandard service.
It's really annoying. If I had been aware of this, I wouldn't have upgraded. I upgraded because I needed better performance, but this is frustrating!
Does anyone still experience errors with GPT4?
I just started paying for GPT-4 and it SUCKS!! Slow. Times out all the time. Doesn't respond anything like as well or as fast as the free version. Why am I paying for a beta version that is so full of bugs and barely works???
JanSch
50
Same here. Paying for it and it's unusable.
atpoint
51
Seconding everything that has been said above. If there is not going to be a fix very soon for poor GPT-4 generation speed, I will be cancelling my subscription. It's borderline unusable for more demanding interactions that require lots of text to be generated, and GPT-3.5 does the same work an order of magnitude faster. Is there someone from the core team who can actually comment on this? It seems like paying members are talking to a wall here.
1 Like
_j
52
GPT-4 is a heavier-computation model by design. It was even slower at release, when it ran at full intellectual power.
You can run an open-source language model on your high-end consumer hardware at the largest size your machine can handle - and get far lower token output per second even when it is the only thing running. Now scale the cost up to around $250,000 for a 2TB/640GB H100 inference server, or around $1 million for a rack instance of GPT-4 that you get access to for a minute of supercomputer computation to answer a question about silly ASCII art.
Just because it looks simple doesn't mean it is simple.
aa2
53
Experiencing really slow output / response times on GPT-4. It has been the case on and off (mostly on) for the last 3 months. Has anyone been able to figure out whether this is a browser-level problem or whether it's OpenAI's servers?
Pondy
54
Still painfully slow on occasion for me. I changed tabs, carried on with some other work, and then switched back to GPT after ~30s and it was still typing the response out, one character at a time.
Then 5 minutes later it's back to normal. Rinse and repeat.
Pondy
55
It's definitely them, as per my previous comment.
It’s slow one minute, and then 5 minutes later it returns to normal, suggesting an overload on their end.
niddle
56
On my side it's horribly slow at the moment as well. No idea what causes this - is it on my side? I cleared the cache, reinstalled plugins, and used it via the web interface rather than a plugin; nothing helps. Sometimes plugins are not loaded or not recognized - a minute later they are. What is this mess I am paying for? Plugins are not working at all right now; it only works without plugins.
1 Like
I ordered ChatGPT Plus because I thought it would be faster, but it is very, very SLOW too.
2 Likes
Painfully slow continuously since they upgraded the UI a few days ago. It can't be used anymore.
1 Like
GPT-4 on my side is so much slower than a couple of days ago. Hard to use.
1 Like
Yes, same! Please tell me - did you find a way to fix it?
Hello! Please tell me - did you manage to fix it?
Same here, very slow since yesterday. It barely finishes generating answers and stops analyzing files halfway through. It's impossible to get any analysis, even after regenerating answers over and over.
3 Likes
joshuad
63
Yup, really slow for at least the last 3-4 hours today, and a much higher chance of network errors than usual (probably as a result of the slowness).
1 Like
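For anyone hitting similar network errors through the API rather than the web UI, a plain retry with exponential backoff usually rides out the transient failures. A minimal sketch, again assuming the `openai` Python package (v1.x); the function name, backoff values, and the set of exceptions caught are my own illustrative choices, not anything official:

```python
import time
from openai import OpenAI, APIConnectionError, APITimeoutError, RateLimitError

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def chat_with_retries(messages, model="gpt-4", max_attempts=5):
    """Call the chat endpoint, retrying transient failures with exponential backoff."""
    delay = 2.0  # seconds; doubled after each failed attempt
    for attempt in range(1, max_attempts + 1):
        try:
            return client.chat.completions.create(model=model, messages=messages)
        except (APIConnectionError, APITimeoutError, RateLimitError) as err:
            if attempt == max_attempts:
                raise  # out of attempts; surface the last error
            print(f"Attempt {attempt} failed ({type(err).__name__}); retrying in {delay:.0f}s")
            time.sleep(delay)
            delay *= 2


reply = chat_with_retries([{"role": "user", "content": "Write a limerick about a dog."}])
print(reply.choices[0].message.content)
```

It won't make slow generations any faster, but it does stop a single dropped connection from wasting a whole long request.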