ChatGPT takes a long time to think and then shows the whole response at once, after 30 to 60 seconds. It used to start writing the response immediately for me. I've noticed this since I upgraded to a Plus account.
I don't have a Plus subscription, so I can't give feedback as such, but I can point out that the ChatGPT Release Notes mention the words "Turbo" and "faster" with regard to the Plus version.
Release Notes (Feb 13)
We’ve made several updates to ChatGPT! Here’s what’s new:
- We’ve updated performance of the ChatGPT model on our free plan in order to serve more users.
- Based on user feedback, we are now defaulting Plus users to a faster version of ChatGPT, formerly known as “Turbo”. We’ll keep the previous version around for a while.
- We rolled out the ability to purchase ChatGPT Plus internationally.
Release Notes (Feb 9)
As we recently announced, our Plus plan comes with early access to new, experimental features. We are beginning to roll out the ability for Plus users to choose between different versions of ChatGPT:
- Default: the standard ChatGPT model
- Turbo: optimized for speed (alpha)
Version selection is made easy with a dedicated dropdown menu at the top of the page. Depending on feedback, we may roll out this feature (or just Turbo) to all users soon.
I tried both models; it doesn't help. I think the problem is not with the model.
I also tried on mobile: it seems to work fine in the mobile browser, but not in the desktop browser. I tried several different browsers on desktop and had the same problem. I cleared the cache, logged out and back in, etc. I've tried everything I know, but it's still the same issue on desktop.
I think I know why. It seems that my company VPN is buffering the text/event-stream
response and only delivers the buffered content once the stream has ended.
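For context, a text/event-stream (server-sent events) response is just a long-lived HTTP response whose body arrives as incremental `data:` lines; the browser renders each one as it lands. The payload below is illustrative only, not the actual ChatGPT wire format:

```
HTTP/1.1 200 OK
Content-Type: text/event-stream
Cache-Control: no-cache

data: {"delta": "Hel"}

data: {"delta": "lo"}

data: [DONE]
```

If a proxy or VPN holds the whole body in a buffer, the browser receives nothing until the connection closes, which matches the "whole response at once after 30 to 60 seconds" symptom exactly.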
So the problem is on my side, but is there any way to mitigate this issue? I mean, I don't have this problem with Bing AI or some other streaming content, so I think there must definitely be a way to mitigate it.
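One way to confirm the diagnosis is to time when the chunks of a streamed response actually arrive: on a healthy stream they trickle in over the whole transfer, while a buffering intermediary delivers almost everything at the very end. A minimal sketch of that heuristic (the function name and thresholds are my own, not from any ChatGPT or OpenAI API):

```python
def looks_buffered(arrival_times, total_duration, late_fraction=0.8):
    """Heuristic: return True if nearly all chunks arrived in the final
    stretch of the transfer, which suggests an intermediary (proxy/VPN)
    held the text/event-stream in a buffer and flushed it at the end."""
    if not arrival_times or total_duration <= 0:
        return False
    cutoff = late_fraction * total_duration
    late = sum(1 for t in arrival_times if t >= cutoff)
    return late / len(arrival_times) > 0.9

# Chunks spread evenly across a 10-second transfer: a healthy stream.
streamed = [float(i) for i in range(10)]
# All chunks dumped in the final half-second: a buffered stream.
buffered = [9.5] * 10

print(looks_buffered(streamed, 10.0))  # False
print(looks_buffered(buffered, 10.0))  # True
```

In practice you would collect `arrival_times` by reading the response incrementally (e.g. `requests.get(url, stream=True)` and recording `time.monotonic()` for each chunk from `iter_content()`); if the result looks buffered on the VPN but not off it, the VPN is the culprit.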