text-davinci-003 API pricing?

I am using the text-davinci-003 model in Python.

import openai

response = openai.Completion.create(
    model="text-davinci-003",
    prompt="Hello, world",
)
I am finding the token count via print(response.usage.total_tokens). I cannot for the life of me find the pricing for the text-davinci-003 model; the only figure I found was a forum post saying it is $0.0120 / 1K tokens. Based on that, my most recent request at 403 tokens should have cost about $0.004884. However, when I go to my billing dashboard, I see it went up by one cent. I am just trying to figure out the total cost. Thanks for the help in advance.


Did you account for the fact that total_tokens is the prompt tokens plus the completion tokens?


text-davinci-003 $0.0200 / 1K tokens

403 tokens × $0.02 / 1,000 = $0.00806
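The arithmetic above can be checked with a quick sketch (the $0.02 / 1K rate is the published text-davinci-003 price quoted above; the helper name is just for illustration):

```python
# text-davinci-003 pricing quoted above: $0.02 per 1,000 tokens.
PRICE_PER_1K_TOKENS = 0.02

def completion_cost(total_tokens: int) -> float:
    """Dollar cost of one Completion call given its total token count."""
    return total_tokens * PRICE_PER_1K_TOKENS / 1000

# The 403-token request from the original post:
print(f"${completion_cost(403):.5f}")  # $0.00806
```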

It is reasonable to think there was some fractional amount already billed, or that simple rounding caused the usage to be displayed as an additional cent.

You are absolutely right to expect billing to be fair and transparent, but worrying because your displayed usage went up one cent after you used eight-tenths of a cent is a whole other level of penny-pinching.

You’re fine, and the billing is fine; at low usage you shouldn’t expect the amount shown in your billing dashboard to be sub-cent accurate.


Sorry, what do you mean? I just assumed that response.usage.total_tokens was the total token count from the response to the completion request.

Ah, ok, then please have a look at @elmstedt’s reply. Seems like everything was ok.


Interesting, I can’t believe I didn’t find this before. Thanks!


Afternoon. ChatGPT-3.5 suggested using the “text-davinci-004” model via the API to analyze my 3,000-token scientific text. Unfortunately, I can’t find the price under https://platform.openai.com/docs/deprecations. Please help by providing a price list for the “text-davinci-004” and “davinci-codex” models. Thank you for your time and help.

These models do not exist. ChatGPT is hallucinating.

All of the models and their prices are listed here,