Can I assume that GPT-4.1 is using GPT-4o's tokenizer? I use tiktoken to roughly estimate my token count, and from what I'm seeing it slightly underestimates the actual usage. I'm not sure whether that's because GPT-4.1's tokenizer differs from GPT-4o's, or whether that's normal.
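For reference, here's a minimal sketch of the kind of estimate I'm doing. It assumes the `o200k_base` encoding that GPT-4o uses (whether GPT-4.1 really uses the same encoding is exactly what I'm unsure about):

```python
import tiktoken

# Assumption: GPT-4.1 uses the same o200k_base encoding as GPT-4o.
# If it doesn't, this count will drift from what the API reports.
enc = tiktoken.get_encoding("o200k_base")

def estimate_tokens(text: str) -> int:
    """Rough token count for a plain string (prompt text only)."""
    return len(enc.encode(text))

print(estimate_tokens("How many tokens is this sentence?"))
```

One thing I'm aware might matter: if the counts are for chat messages, the chat format itself adds a few tokens per message (role markers, separators) that a plain `encode()` on the text won't capture, which could explain a small underestimate on its own.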