Unexpected Vision Pricing

I’m expecting a low-detail image to cost 85 tokens, per this documentation.

However, I’m being charged about 1,000 tokens per image. What am I doing that is causing this high cost?

  const messages = [
    {
      role: "system",
      content: `Given a list of images, return a JSON object that fills in the below JSON example template to describe the photos given.
          {"rule": "brown_hair", "description": "This person has brown hair", "result": "no"},`,
    },
    {
      role: "user",
      content: [
        {
          type: "text",
          text: "Process the following images:",
        },
        // imageUrls: the array of image URLs being processed
        // (name assumed; the receiver of .map was missing from my paste)
        ...imageUrls.map((url) => ({
          type: "image_url",
          image_url: {
            url: url,
            detail: "low",
          },
        })),
      ],
    },
  ] as ChatCompletionMessageParam[];

  const requestBody = {
    model: "gpt-4-turbo",
    messages,
    response_format: { type: "json_object" },
    max_tokens: maxTokens,
  };

  const completion = await openai.chat.completions.create(requestBody);

I would capture the usage object out of the API response, to make sure you aren’t confusing the prompt (input) tokens with the total tokens.
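A minimal sketch of pulling those fields apart, assuming the standard Chat Completions `usage` shape (`prompt_tokens`, `completion_tokens`, `total_tokens`) — `completion` stands in for your API response:

```typescript
// The usage object returned on a chat completion response.
interface Usage {
  prompt_tokens: number;     // input tokens, including image tokens
  completion_tokens: number; // output tokens
  total_tokens: number;      // sum of the two
}

// Format the three counts so they can't be confused with each other.
function summarizeUsage(usage: Usage): string {
  return `prompt=${usage.prompt_tokens} completion=${usage.completion_tokens} total=${usage.total_tokens}`;
}

// e.g. after the API call:
//   console.log(summarizeUsage(completion.usage));
```

If `total_tokens` is what you were reading, remember it includes the model’s output as well as your text prompt, not just the image cost.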

You can also test with URLs for images under 512×512 vs. one whose smallest side is greater than 768 (just upload somewhere like imgur yourself), and see whether the token consumption actually changes the way high detail would.

The consumption is, of course, multiplied by the number of images.
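For a sanity check, the published accounting can be sketched: low detail is a flat 85 tokens per image, while high detail scales the image to fit within 2048×2048, resizes the shortest side to 768, and charges 170 tokens per 512‑px tile plus a base 85. (These constants are from OpenAI’s vision guide; verify against the current docs.)

```typescript
// Flat cost of a low-detail image, per the vision pricing docs.
const LOW_DETAIL_TOKENS = 85;

// High-detail cost for an image of the given pixel dimensions.
function highDetailTokens(width: number, height: number): number {
  // 1. Scale down to fit within a 2048x2048 square (never scale up).
  const fit = Math.min(1, 2048 / Math.max(width, height));
  let w = width * fit;
  let h = height * fit;
  // 2. Scale so the shortest side is 768 px (again, only downward).
  const shrink = Math.min(1, 768 / Math.min(w, h));
  w *= shrink;
  h *= shrink;
  // 3. 170 tokens per 512-px tile, plus a base of 85.
  const tiles = Math.ceil(w / 512) * Math.ceil(h / 512);
  return 170 * tiles + 85;
}

// Per-image cost multiplied by the image count.
function batchCost(perImage: number, imageCount: number): number {
  return perImage * imageCount;
}
```

So if low detail were actually being applied, three images should add roughly `batchCost(85, 3)` = 255 input tokens on top of your text; ~1,000 tokens per image looks much more like high-detail tile pricing.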