How to calculate tokens when using function calling

The text is transformed by OpenAI’s internal “function” endpoint, somewhere after the model selector, load distributor, accounting server paths, and other undocumented routing internals; it is not done by any public code.

Here I extract and show the byte string transmitted by the Python API module when a function is included: Typo in OpenAI API Documentation - #2 by _j
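
If you want to reproduce that capture yourself, here is a minimal sketch that prints the raw request body before it leaves your machine. It assumes the current `openai` Python package (v1+, which is built on httpx); the model name and the `get_weather` function are placeholders, not anything from the post above.

```python
# Sketch: print the exact bytes the openai Python client sends when a
# function/tool is included. Assumes openai>=1.0 (httpx-based transport).
import httpx
from openai import OpenAI

def log_request(request: httpx.Request) -> None:
    # request.content is the serialized JSON body as it goes over the wire
    print(request.content.decode("utf-8"))

client = OpenAI(
    http_client=httpx.Client(event_hooks={"request": [log_request]})
)

client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": "What is the weather in Paris?"}],
    tools=[{
        "type": "function",
        "function": {
            "name": "get_weather",  # made-up example function
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"}
                },
                "required": ["city"],
            },
        },
    }],
)
```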

The input is validated against a standard JSON Schema, and malformed requests are rejected (e.g. “bool” or “float” instead of “boolean” or “number”), but no warning is given when the programmer’s intentions are damaged by the rewriting, such as omitting keywords like “example” or “maximum”, or discarding any nested descriptions.
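
Since the rejection happens only server-side, a local pre-flight check can catch the type-name mistakes before you spend a request on them. This is just an illustrative traversal, not an official validator, and it only covers the `"bool"`/`"float"` class of error mentioned above.

```python
# Sketch: flag invalid JSON Schema type names in a function definition
# before sending it, so the API doesn't have to reject the request.
VALID_TYPES = {"object", "array", "string", "number", "integer", "boolean", "null"}

def check_types(schema: dict, path: str = "$") -> list[str]:
    """Recursively collect properties whose declared type is not a JSON Schema type."""
    problems = []
    declared = schema.get("type")
    if isinstance(declared, str) and declared not in VALID_TYPES:
        problems.append(f"{path}.type: '{declared}' is not a JSON Schema type")
    for name, sub in schema.get("properties", {}).items():
        problems.extend(check_types(sub, f"{path}.properties.{name}"))
    if isinstance(schema.get("items"), dict):
        problems.extend(check_types(schema["items"], f"{path}.items"))
    return problems

# "float" would be rejected by the endpoint; "number" is what it expects.
bad = {"type": "object", "properties": {"price": {"type": "float"}}}
print(check_types(bad))
# ["$.properties.price.type: 'float' is not a JSON Schema type"]
```

Note that this only catches outright rejections; it cannot warn you about the silent removal of keywords like “example” or “maximum”, which you can only see by inspecting what the model actually receives.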

Passing a function even selects a differently-trained model than the same request without one.