I think I'm being charged more than I should

I think I’m getting charged more than I should be? I’m using the OpenAI API on a site with gpt-3.5-turbo. I barely used 1k tokens, but I’m already at $0.24 when I should be at $0.002. Is there something I’m missing?


Take a look here and see what the system thinks you’ve used: OpenAI API

You can report any inaccuracies to https://help.openai.com and use the chat bot in the bottom right corner.

Could it be this? Why am I seeing duplicate charges? | OpenAI Help Center

I literally haven’t even used 750 words and it’s already $0.24 when it should be at most $0.002. Help?

That article is about ChatGPT Plus, so not relevant.

On the Usage page, select a day in the Daily Breakdown and you can see all the requests and exact number of tokens you are being charged for. Better than estimating word counts.

Also, just to be clear, you’re using gpt-3.5-turbo, not text-davinci and not gpt-4?

So I checked and I can’t see the number of tokens I used anywhere, and yes I’m 100% sure I’m using gpt-3.5-turbo.

Usage page, then select a day, click Language Models, then click the time.

Is this normal?

I didn’t even talk that much, is this how it works?

Have you given your API key to any 3rd-party applications, or installed any ChatGPT extensions in your browser? The token usage shown there is representative of fairly large conversations: 17,000 tokens is about 12,750 words (25 pages of text). That would be typical of extended chats with the AI over perhaps an hour or more, or of automated testing where repeated queries are being performed.

From a developer standpoint those are fairly low numbers for an active project, but if you only asked 1 or 2 very short questions… then that is unusual.

Bear in mind that a full-context chat can be 4,000 tokens per prompt once the history has grown, so it can add up.
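If you want to sanity-check those numbers yourself, a quick sketch like this counts tokens the same way the API does (it uses the tiktoken library; the sample text is just a placeholder):

```python
# Sketch: count tokens locally with tiktoken (pip install tiktoken).
# The sample text is only a placeholder.
import tiktoken

enc = tiktoken.encoding_for_model("gpt-3.5-turbo")
sample = "Write a short greeting for the visitors of my website."
print(len(enc.encode(sample)), "tokens")  # English text averages roughly 0.75 words per token
```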

I’m using the OpenAI API on a site

Which site are you using? Depending on what the site does it might be sending up a lot of information in the prompt.


I’m using venus.chub.ai. Is there a way to lower the numbers?

I’m not using any extension or anything like that.

Many third-party apps use a lot more tokens than you may expect. They may be summarizing past responses, extracting metadata, generating titles, whatever. Also, if it’s a chat-type interface, then you are charged for the tokens of the entire history (or its most recent parts).

Best thing would be to ask the creators of that site.

My advice would be to cancel that API key if you are unhappy with the usage levels. You are free to use your key as you see fit, so long as it is used within the Terms of Service.

With a 3rd-party site, it is impossible to determine where, by whom, and for what purpose your API key is being used.

Alright, thanks for the answers everyone. I appreciate it!

Looks like some kind of “character” chatbot type thing. The prompts for this will probably be quite large to give it the required personality.

The way chat works is that every time you send a message, everything needs to be sent up again, including the initial prompt and all the message history so far.
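Roughly, a chat integration looks something like this under the hood (a minimal sketch using the openai Python package’s older ChatCompletion interface; the persona text and messages are made-up placeholders, not what that site actually sends):

```python
# Minimal sketch of a chat loop: the WHOLE history is re-sent on every turn.
# Persona and messages are placeholders, not venus.chub.ai's real prompts.
import openai

# openai.api_key = "sk-..."  # your API key

messages = [
    {"role": "system", "content": "You are <a long character/persona description>."},
]

def ask(user_text):
    messages.append({"role": "user", "content": user_text})
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=messages,  # the entire history goes up again, and is billed again
    )
    reply = response["choices"][0]["message"]["content"]
    messages.append({"role": "assistant", "content": reply})
    return reply
```

So by the tenth turn, each request carries the system prompt plus nine earlier exchanges, which is why the prompt token count keeps climbing.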

As has been pointed out, putting your API key into a random website does carry some risk. But looking at the history you posted it looks pretty normal for a chat bot with a large prompt.

Yeah, fair enough. I have a question: do I get charged based on how many words the bot types as well? I’m a little confused.

Yes, in your history you can see things like:

1,918 prompt + 117 completion = 2,035 tokens

1,918 prompt - this is basically the initial prompt + any previous questions from you and previous responses from the bot

117 completion - this is the new answer from the bot.

You are charged for both the input and the output of the bot.
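To put that in numbers, using the $0.002 per 1K tokens figure quoted at the top of the thread (a simplified flat rate; prompt and completion rates can differ, so treat the pricing page as the authority):

```python
# Sketch: cost of the example request above at a flat $0.002 per 1K tokens.
# The rate is the figure quoted earlier in the thread; check the pricing page for current values.
prompt_tokens = 1918
completion_tokens = 117
rate_per_1k = 0.002

cost = (prompt_tokens + completion_tokens) / 1000 * rate_per_1k
print(f"${cost:.5f}")  # about $0.00407 for this 2,035-token request
```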

I see, it makes sense now. I’m not overpaying anything lol. It’s a really good deal, to be honest. It’s really fun, a good time killer, and pretty cheap.


Hello once again. I was wondering if there’s a way I could use my own computer to generate the tokens instead of that website, so I can pay less. I saw there are reverse proxies for OpenAI; could that be it?