What are the reasons for free ChatGPT becoming slower?

Yes, I know that free ChatGPT performance depends on server load, but are there any further statements or observations on throttling rules available? E.g., a dashboard showing the current load on the OpenAI servers? Has ChatGPT become slower for anyone else over the last day or two? I am seeing “Error in message stream” errors much more frequently than before, and the response typing speed has slowed to 0.5–1 words per second. Is ChatGPT load currently higher than usual?

Or are there other throttling rules? For instance, a few days ago I pasted a really large amount of text into ChatGPT. Is that a coincidence, or does OpenAI apply any form of throttling based on the number of tokens sent to ChatGPT per day/week/month?

And maybe as an extra question for ChatGPT Plus subscribers: how frequently (if at all) do you experience slow performance with ChatGPT?

It’s quite fast for me (on the free plan, previously a subscriber), but that’s anecdotal. You can actually look at sampled speeds for the API (not ChatGPT) on sites like the “OpenAI API and other LLM APIs response time tracker”.
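
If you want a rough number of your own rather than relying on the trackers, you can time a streamed response against the Chat Completions API yourself. This is a minimal sketch using the official `openai` Python package; the model name and prompt are just placeholders, and API speed will not exactly match what the ChatGPT web app gives you:

```python
import time
from openai import OpenAI  # pip install openai

client = OpenAI()  # reads OPENAI_API_KEY from the environment

messages = [{"role": "user", "content": "Write three sentences about the weather."}]

start = time.perf_counter()
first_chunk_at = None
text = ""

# Stream a short completion and note when the first content chunk arrives.
stream = client.chat.completions.create(
    model="gpt-3.5-turbo",  # placeholder model name
    messages=messages,
    stream=True,
)
for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        if first_chunk_at is None:
            first_chunk_at = time.perf_counter()
        text += chunk.choices[0].delta.content

end = time.perf_counter()
words = len(text.split())
if first_chunk_at is not None and words:
    stream_time = max(end - first_chunk_at, 1e-6)
    print(f"time to first chunk: {first_chunk_at - start:.2f}s")
    print(f"streaming speed: {words / stream_time:.1f} words/s over {words} words")
```

Run it a few times at different hours if you want to see whether the slowdowns are time-of-day related; a single request is a noisy measurement.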

There seem to be regular spikes of GPT-3.5 slowness, while GPT-4 seems to be getting faster and at times runs at about half the speed of 3.5. It seems like whatever 3.5 is running on is fairly inconsistent. Azure is performing better for some reason; I thought they were the same thing.
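
If you want to compare the two endpoints yourself, something like the sketch below times a single non-streaming completion against each (again using the official `openai` Python package; the Azure endpoint URL, API version, and deployment name are placeholders for your own resource, and one request per endpoint is only a very rough comparison):

```python
import time
from openai import OpenAI, AzureOpenAI

PROMPT = [{"role": "user", "content": "Say hello in one sentence."}]

def time_completion(client, model):
    """Time one non-streaming chat completion and return elapsed seconds."""
    start = time.perf_counter()
    client.chat.completions.create(model=model, messages=PROMPT, max_tokens=32)
    return time.perf_counter() - start

# Standard OpenAI endpoint (reads OPENAI_API_KEY from the environment).
openai_client = OpenAI()

# Azure OpenAI endpoint (reads AZURE_OPENAI_API_KEY from the environment);
# the endpoint URL, API version, and deployment name are placeholders.
azure_client = AzureOpenAI(
    azure_endpoint="https://YOUR-RESOURCE.openai.azure.com",
    api_version="2024-02-01",
)

print("OpenAI:", time_completion(openai_client, "gpt-3.5-turbo"), "s")
print("Azure :", time_completion(azure_client, "YOUR-35-TURBO-DEPLOYMENT"), "s")
```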
