Sometimes it is necessary to verify in advance whether a given text will fit within the model’s token limit. This helps in making informed decisions, such as whether to split the text into smaller segments or apply other necessary adjustments to ensure compatibility with the model’s requirements.
Hi @gedean.dias
You can use tiktoken to count tokens before you make an API call.
Thank you for the suggestion. However, my current stack does not include Python, and I would prefer a solution that can be integrated directly into our existing infrastructure. Could you please suggest an alternative approach that doesn’t rely on Python?
You can use this community built tokenizer based on the original tiktoken:
GitHub - dqbd/tiktoken: JS port and JS/WASM bindings for openai/tiktoken