Estimate token count using structured output

Hi, is it possible to estimate the number of input tokens used by a pydantic model? It would be useful to estimate the cost in advance before sending a request to the API. Thanks!

Hi @thomas246 !

You could estimate a “minimum cost” by passing your JSON schema through tiktoken and counting the tokens. The request will never cost less than that, and it gives you a starting point for reasoning about the total cost, i.e. JSON_SCHEMA + X. The X depends on what your schema allows (e.g. arrays, open-ended strings, etc.), but you could come up with some “sample” values, pass those through tiktoken as well, and use that as a rough baseline to compare against. You can then draw a standard-error band around that number, monitor the number of tokens actually returned, and flag outliers.
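For illustration, here's a minimal sketch of that idea. The `Invoice` model, its sample values, and the `"gpt-4o"` encoding name are just assumptions for the example; substitute your own model and the encoding that matches the API model you're calling.

```python
import json

import tiktoken
from pydantic import BaseModel


# Hypothetical example model -- replace with your own schema.
class Invoice(BaseModel):
    customer: str
    total: float
    line_items: list[str]


# Serialize the JSON schema roughly as it would be sent to the API.
schema_text = json.dumps(Invoice.model_json_schema())

# Pick the tokenizer for the model you plan to call ("gpt-4o" is an assumption).
enc = tiktoken.encoding_for_model("gpt-4o")

# The schema-only count is the floor: the request can't cost less than this.
schema_tokens = len(enc.encode(schema_text))
print(f"Schema-only token floor: {schema_tokens}")

# Tokenize a representative "sample" instance to get a rough baseline for X.
sample = Invoice(customer="Acme Corp", total=123.45, line_items=["widget", "gadget"])
sample_tokens = len(enc.encode(sample.model_dump_json()))
print(f"Sample-value estimate:   {sample_tokens}")
```

From there you can log the actual token usage reported by the API for each call and compare it against `schema_tokens + sample_tokens` to build your error band over time.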
