Hi there, I ran a test with `gpt-3.5-turbo` and made 403 API requests, totaling 142k input/context tokens and 73k generated tokens. The bill says $0.36.
The pricing page says `gpt-3.5-turbo-0125` costs $0.0005/1k input and $0.0015/1k output tokens. With these numbers I arrive at roughly $0.18, not $0.36. I only get $0.36 if I use the numbers for `gpt-3.5-turbo-instruct` ($0.0015/1k input, $0.002/1k output).
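To double-check my arithmetic (token counts rounded to the nearest thousand, prices per 1k tokens taken from the pricing page):

```python
input_tokens = 142_000
output_tokens = 73_000

# gpt-3.5-turbo-0125 pricing: $0.0005/1k input, $0.0015/1k output
cost_0125 = input_tokens / 1000 * 0.0005 + output_tokens / 1000 * 0.0015

# gpt-3.5-turbo-instruct pricing: $0.0015/1k input, $0.002/1k output
cost_instruct = input_tokens / 1000 * 0.0015 + output_tokens / 1000 * 0.002

print(f"0125:     ${cost_0125:.2f}")      # $0.18
print(f"instruct: ${cost_instruct:.2f}")  # $0.36
```

The instruct pricing lands exactly on what I was billed; the 0125 pricing does not come close.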
What really confused me is that I cannot call the OpenAI endpoint with `gpt-3.5-turbo-0125`; it returns:
ValueError: Unknown model 'gpt-3.5-turbo-0125'. Please provide a valid OpenAI model name in:
[…]
gpt-3.5-turbo
gpt-3.5-turbo-16k
gpt-3.5-turbo-1106
gpt-3.5-turbo-0613
gpt-3.5-turbo-16k-0613
gpt-3.5-turbo-0301
gpt-35-turbo-16k
gpt-35-turbo
gpt-35-turbo-1106
gpt-35-turbo-0613
gpt-35-turbo-16k-0613
[…]
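For context, here is a minimal sketch of the kind of call that triggers this (simplified; my actual code goes through a wrapper library, and the `ValueError` above looks like client-side model-name validation rather than an error returned by the API itself, which may be a clue):

```python
from openai import OpenAI  # official openai Python SDK, v1.x

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# The model name that the pricing page lists, but that my setup rejects:
resp = client.chat.completions.create(
    model="gpt-3.5-turbo-0125",
    messages=[{"role": "user", "content": "Hello"}],
)

print(resp.choices[0].message.content)
print(resp.usage)  # prompt_tokens / completion_tokens, i.e. what gets billed
```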
So what is going on here? Is the pricing page outdated, is the backend not up to date with the pricing page, or am I doing something wrong?
Many thanks to anyone who has an idea!