GPT-4 and GPT-3.5-turbo API cost comparison and understanding

Here is what I compiled on the API pricing for GPT-4 (8K and 32K context) and GPT-3.5-turbo (the ChatGPT API):

Model            Completion  Prompt      Context  TPM    RPM  1M Tokens
gpt-4-0314       $0.06/1K    $0.03/1K    8192     40000  200  $1200.00
gpt-4-32k-0314   $0.12/1K    $0.06/1K    32768    80000  400  $1200.00
gpt-3.5-turbo    $0.002/1K   $0.002/1K   4096     40000  200  $4.00

The pricing page shows GPT-3.5-turbo pricing as "usage" and lists only $0.002/1K tokens, but for GPT-4 it lists separate prompt and completion pricing per 1K tokens. So if someone sends a 1K prompt and gets a 1K response (i.e. completion), their cost for GPT-3.5-turbo is 2 × $0.002 = $0.004, whereas the same token counts on the GPT-4 8K model would be 1K prompt + 1K completion = $0.03 + $0.06 = $0.09. Right?

So for the ChatGPT API (GPT-3.5-turbo), a 2K-token exchange (call + response) = $0.004, vs. $0.09 for the GPT-4 8K context model.

So the ChatGPT API is 22.5 times cheaper?
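The arithmetic above can be checked with a few lines of Python (a sketch; the per-1K rates are the ones quoted in this thread, not fetched from anywhere):

```python
# Per-1K-token rates quoted in this thread (USD)
PRICES = {
    "gpt-3.5-turbo": {"prompt": 0.002, "completion": 0.002},
    "gpt-4-8k":      {"prompt": 0.03,  "completion": 0.06},
}

def call_cost(model: str, prompt_tokens: int, completion_tokens: int) -> float:
    """USD cost of one API call: each side is billed per 1K tokens at its own rate."""
    rates = PRICES[model]
    return (prompt_tokens / 1000) * rates["prompt"] \
         + (completion_tokens / 1000) * rates["completion"]

cheap = call_cost("gpt-3.5-turbo", 1000, 1000)  # ≈ 0.004
pricey = call_cost("gpt-4-8k", 1000, 1000)      # ≈ 0.09
print(f"ratio: {pricey / cheap:.1f}x")          # ratio: 22.5x
```

For equal prompt and completion sizes the ratio is fixed at 22.5×, but since GPT-4 bills completions at twice its prompt rate, completion-heavy calls skew even more expensive.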


Seems like it. I think this is also because GPT-4 is more advanced and still in beta.
Have you compared it with the price of GPT-3?


Damn, GPT-4 is really expensive. Hope to see the price go down soon, since the whole world is probably going to sign up.

Has Bing Chat rolled out GPT-4 yet? That's free and more responsive than the free ChatGPT.

I heard that Bing Chat has been using GPT-4 since the beginning.


Yes, it seems so according to Confirmed: the new Bing runs on OpenAI’s GPT-4 | Bing Search Blog.
Although in my comparison so far, both the free ChatGPT and Bing Chat give the same results; I should query them with more complex questions.

I was in on the first round of Bing. It was OK in the beginning, hit or miss along the way, and pretty much worthless yesterday: it literally just gave me ad-infested links to third-party sources for direct Windows admin questions, and no answers. Today? 180 degrees. Perfect, straight-to-the-point answers.

I haven't noticed any ads being shown to me (or maybe I just didn't notice); it just gives a few links at the bottom of each reply, which I never click, and they are nonintrusive.
I always have two browser windows open, one with ChatGPT and the other with Bing Chat. Because I am on free ChatGPT, I know ChatGPT may often not be working, but Bing keeps working. And it formats source code like ChatGPT does, which is great; I am a programmer.

In Bing I noticed it is better to give more detailed information, with keywords such as ‘always’ and ‘ignore’.
Do you use this and still get hit-or-miss answers?


For GPT-4 vs. GPT-3.5, I calculated by trying different examples, and the price difference can range from 20× to 118×.

That means GPT-4 could be more than 100 times as expensive as GPT-3.5.

For this question:

“Give me a brief introduction of USA history”

gpt-3.5 consumes 183 tokens in total, so the amount is 183 × $0.002 / 1000 = $0.000366.

I tried the same question with gpt-4 and it cost $0.04326.

gpt-4 does return a longer answer, though, with the same parameters.
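That gpt-3.5 figure is easy to reproduce, since gpt-3.5-turbo bills prompt and completion at the same rate (a sketch; the 183-token count is the one reported above, as returned in the API response's usage field):

```python
RATE_PER_TOKEN = 0.002 / 1000   # gpt-3.5-turbo: $0.002 per 1K tokens, both directions
total_tokens = 183              # prompt + completion, from the API's usage field
cost_usd = total_tokens * RATE_PER_TOKEN
print(f"${cost_usd:.6f}")       # $0.000366
```

The gpt-4 cost can't be recomputed the same way from the post alone, because it bills prompt and completion at different rates and the token split isn't given.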

Please let me know how to identify which model I am using: gpt-4-0314 or gpt-4-32k-0314? I got an API invite and created new keys. Thanks.

Once they activate your GPT-4 API access, is it still possible to use the old, less expensive tokens (gpt-3.5-turbo)?

It doesn’t depend on the token you’re using, it depends on the model you’re calling with the API. See docs: OpenAI API.

So yes, it is possible to continue using gpt-3.5-turbo after getting access to GPT-4. Just make sure to pass the correct model argument, e.g.:

curl https://api.openai.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'

> Please let me know how to identify which model I am using, gpt-4-0314 or gpt-4-32k-0314? I got an API invite and I created new keys. Thanks

I define the model in my requests as gpt-4

In the headers I get back is: openai-model => gpt-4-0314
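For scripted checks, that header can be read programmatically. A minimal offline sketch (the helper name is my own; the `openai-model` header is the one quoted in this thread):

```python
def served_model(headers: dict) -> str:
    """Return the model the API actually served, per the 'openai-model' response header."""
    # HTTP header names are case-insensitive; real HTTP clients usually normalize
    # them, but a plain dict does not, so check the common spellings.
    for key in ("openai-model", "Openai-Model", "OpenAI-Model"):
        if key in headers:
            return headers[key]
    return "unknown"

# Headers shaped like the ones quoted above:
print(served_model({"openai-model": "gpt-4-0314"}))  # gpt-4-0314
```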


I just tried, in my client, switching the model to gpt-3.5-turbo halfway through a conversation, and it worked fine.

My next prompt succeeded, and the headers suggested I was using the gpt-3.5 model:

openai-model => gpt-3.5-turbo-0301

I do not think GPT-4 is expensive. Try to find a team of well-educated academics and pay them full-time; they will not keep up, and they are significantly more expensive.

Well… for GPT-4,

If you’re using it personally, it’s expensive.
For businesses, it’s an acceptable price.
In larger companies, the price is very cheap, but concerns about security start to arise. (Yes, OpenAI says it is safe, but they still have concerns.)