When sending a message to the OpenAI chat API, do JSON special characters (e.g. "{") count toward the final prompt_tokens?

Hello, I'm sending this message: "What is the most beautiful country?"

I'm sending it as a JSON object: {"role": "user", "content": "What is the most beautiful country?"}

I thought it would return about 7 tokens for the prompt, but it doesn't.

It is returning 15 tokens for the prompt. Is that correct, or shouldn't it be returning that amount? Even when I send just a dot "." as the message, it returns about 9 tokens for the prompt.

I’m using gpt-3.5-turbo.
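
To show exactly what I mean, here is a rough sketch of the call, using the openai Python package (v1-style client) and tiktoken to count the content text. The client choice is just for illustration; the request body is the same JSON as above.

```python
import tiktoken
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

content = "What is the most beautiful country?"

# Count only the content text with the cl100k_base encoding
# (the one gpt-3.5-turbo uses, as far as I know)
enc = tiktoken.get_encoding("cl100k_base")
print(len(enc.encode(content)))  # 7 tokens for the sentence by itself

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": content}],
)

# The usage block in the response is where I'm reading the count from
print(response.usage.prompt_tokens)  # this is what comes back as 15
```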


Welcome to the OpenAI community, @KingOfPeru!

  1. You can use max_tokens to limit the number of tokens generated.

  2. You can set a system message specifying the kind of response you want (e.g. terse, short, concise), or tell it how many tokens you want (e.g. "Answer in 5 tokens") in the system message.

Refer to the Chat Completions API docs. A short sketch of both options follows below.
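
For example, something along these lines (a minimal sketch with the openai Python package, v1-style client; the exact limit and system wording are only placeholders to adjust for your use case):

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        # System message steering the style and length of the answer
        {"role": "system", "content": "Answer tersely, in 5 tokens or fewer."},
        {"role": "user", "content": "What is the most beautiful country?"},
    ],
    # Hard cap on the number of tokens the model may generate
    max_tokens=16,
)

print(response.choices[0].message.content)
print(response.usage.completion_tokens)  # stays at or below max_tokens
```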


@KingOfPeru
I hope my post is helpful.