ChatGPT Turbo performance with a ChatGPT Plus membership vs. using it via the API

I currently have a ChatGPT Plus account, which gives me faster response times and good availability even during peak hours.

I was wondering whether the response times and availability are the same when using the API to chat with ChatGPT Turbo. Is it just as available, and are the response times just as fast, or is there still a difference?

I also have a question about GPT-4: in the Plus interface I can only use it a limited number of times every 3 hours. Does the same limit apply when communicating with GPT-4 through the API?


ChatGPT and the API are tracked separately and aren’t linked together. The API is billed by usage, with no 3-hour limit, up to your set spend limit. I believe the current default is $120 per month, but you can request an increase.

The API also has rate limits that depend on the model and on what your account has been granted. My rate table for the “chat” endpoint is here:
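If you want to check the limits applied to your own account, one option is to read the rate-limit headers the API returns with each response. Here is a minimal sketch in Python using the `requests` library; the header names follow OpenAI’s documented `x-ratelimit-*` convention, and the model and prompt are just placeholders, so treat it as an illustration rather than a definitive recipe.

```python
import os
import requests

# Minimal chat completion request; the model name and prompt are placeholders.
resp = requests.post(
    "https://api.openai.com/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
    json={
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": "Hello"}],
    },
    timeout=60,
)
resp.raise_for_status()

# Rate-limit headers come back with every response (names per OpenAI's docs;
# exact availability may vary by account and endpoint).
for name in (
    "x-ratelimit-limit-requests",
    "x-ratelimit-remaining-requests",
    "x-ratelimit-limit-tokens",
    "x-ratelimit-remaining-tokens",
):
    print(name, resp.headers.get(name))
```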

My average output speed with GPT-4 over the last three days has been about 7 tokens per second (7 TPS). That has improved since I started using GPT-4, when it was around 5 TPS.
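If you want to measure this yourself, one simple approach is to time a single (non-streamed) completion and divide the reported completion tokens by the elapsed time. A minimal sketch with the `openai` Python package (v1+ client style); the model name and prompt are placeholders, and this only gives a rough point estimate.

```python
import time
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

start = time.monotonic()
response = client.chat.completions.create(
    model="gpt-4",  # placeholder; use whichever model you are benchmarking
    messages=[{"role": "user", "content": "Write a short paragraph about rate limits."}],
)
elapsed = time.monotonic() - start

# usage.completion_tokens is the number of output tokens for this request.
tps = response.usage.completion_tokens / elapsed
print(f"{response.usage.completion_tokens} tokens in {elapsed:.1f}s -> {tps:.1f} TPS")
```

Note that the elapsed time here includes queueing and time-to-first-token, so it slightly underestimates pure generation speed; timing a streamed response from the first chunk would be more precise.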

The output TPS varies day to day, and even hour to hour.

Here is a general website that trends this:
