Difference in cost between a ChatGPT request made from the OpenAI Playground and one made via the OpenAI web API?

TL;DR: Do ChatGPT questions asked in the OpenAI Playground cost the same as questions sent by my web app to the OpenAI web API?


Hi all. Yesterday I used the OpenAI Playground to submit a particular question to ChatGPT (gpt-3.5-turbo) and obtain a response. The cost, as reported by the usage page, was 178 total tokens and $0.000271.

Let’s say I ask the exact same question to ChatGPT (using the same model), and it gives me the exact same answer. Only this time, instead of using the OpenAI Playground to submit the question, I write code to make a network request to the OpenAI web API.

  • Can I expect the question/answer sent to the web API to cost the same, in tokens, as my OpenAI Playground question/answer? My educated guess is ‘yes,’ i.e. that the token calculation doesn’t vary based on how the question is submitted, but I just want to confirm here.

  • Can I expect the question/answer sent to the web API to cost the same, in dollars, as my OpenAI Playground question/answer? If not, how might it vary?

Context: As a side project, I want to build a web app that will ask ChatGPT many questions via the web API. Right now I want to estimate how much those requests will cost me financially.
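
In case it helps, this is roughly what I mean by making the request from code (a minimal sketch assuming the openai Python package’s 0.x-style interface; the question text and key are just placeholders):

```python
# Sketch of the request I have in mind (openai Python package, 0.x-style API).
# The question text is a placeholder; what I care about is the reported usage.
import openai

openai.api_key = "sk-..."  # my API key

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "The same question I asked in the Playground"}],
)

print(response["choices"][0]["message"]["content"])
print(response["usage"])  # prompt_tokens, completion_tokens, total_tokens
```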

Thanks in advance.


OpenAI Playground usage data

  • gpt-3.5-turbo-0301
  • 170 prompt + 8 completion = 178 tokens
  • $0.000271

Hi cagross,

ChatGPT is free.

You can get a Plus subscription for priority access and early access to new beta features as they are released. This is charged at $20/mo plus local VAT.

The OpenAI GPT API endpoints are priced at various levels for prompts and completions (input/output); details can be found on the Pricing page.
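
For example, assuming the gpt-3.5-turbo rates listed on that page at the time ($0.0015 per 1K prompt tokens and $0.002 per 1K completion tokens), your usage works out to 170/1000 × $0.0015 + 8/1000 × $0.002 = $0.000255 + $0.000016 = $0.000271, which matches the figure on your usage page.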


I don’t think ChatGPT utilizes just one model, e.g. GPT-4 (is GPT-4 even just one model?).
And if I understand correctly, you want to scrape ChatGPT?


The Playground is just a UI over the API; if you look at the network traffic, it’s calling the same API endpoint you would use in your projects. So yes, the Playground and the API should cost the same.
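
For instance, a direct request to that endpoint looks like this (a minimal sketch using Python’s requests package; the API key, model, and prompt are placeholders):

```python
# Minimal sketch of hitting the same Chat Completions endpoint the Playground
# calls; replace the placeholder API key before running.
import requests

resp = requests.post(
    "https://api.openai.com/v1/chat/completions",
    headers={"Authorization": "Bearer sk-..."},
    json={
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": "Hello"}],
    },
)
print(resp.json()["usage"])  # the token counts billed for this request
```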


Yes, Playground requests and requests you issue yourself using curl, Python, Node, or any other API client work exactly the same.
The Playground is just a graphical interface on top of the API.


Hi @cagross

ChatGPT is an OpenAI product at chat.openai.com

In case you’re trying to refer to the Chat Completion models’ pricing, you can visit the pricing page.

The Playground is a quick and easy way to explore the OpenAI API; the pricing is going to be the same as making API calls from your own code - unless the playground is doing additional processing.
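
If you want to sanity-check the dollar figure yourself, something like this works (the per-1K rates below are assumptions here; plug in whatever the pricing page currently lists for your model):

```python
# Rough sketch: convert the usage block of an API response into a dollar
# estimate. The per-1K rates are assumptions -- use the numbers from the
# pricing page for whichever model you call.
def estimate_cost(usage, prompt_rate=0.0015, completion_rate=0.002):
    return (usage["prompt_tokens"] / 1000) * prompt_rate \
         + (usage["completion_tokens"] / 1000) * completion_rate

print(estimate_cost({"prompt_tokens": 170, "completion_tokens": 8}))  # ~0.000271
```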


@novaphil @jwatte OK thanks for that info. That’s exactly what I wanted to confirm.

@sps

unless the playground is doing additional processing.

Can you give me an example of what you mean by this?

Can you expand on this? As I understand it, every request sent via chat.openai.com or the OpenAI Playground requires the user to first create an OpenAI account and then consumes tokens. New OpenAI accounts are given X tokens to use for free (for a limited time), but after those tokens are expended (or expire), new tokens come at a monetary cost. What am I misunderstanding?

ChatGPT (chat.OpenAI.com) is separate from the API/Playground and doesn’t have per-token costs. ChatGPT is free (or $20/month for GPT-4 and plugins).


Ah OK thanks very much for that clarification. I wasn’t aware, and it’s very helpful.

If that reply answered your question, then consider marking it as the solution.
