Token Counter / Splitter?

Does anybody know of a tool that lets you paste in a large amount of text and tells you how many tokens it is? I find myself resorting to trial and error when breaking up longer documents I need summarized, critiqued, etc.

It would be great if you could give it a large document and it would automatically break it up into 4096-token segments. Seems like this should exist, but I haven't had any luck googling.

Appreciate it!


I use tiktoken.

openai/tiktoken: tiktoken is a fast BPE tokeniser for use with OpenAI’s models.


I prefer this one, as it does a better job of counting the hidden tokens.
