How to see tokens per message for GPT models?

I am trying to build a token counter that tallies the number of tokens in a prompt. I followed the documentation at How to count tokens with Tiktoken | OpenAI Cookbook. Since tokenization varies between models, I want to determine the per-message token counts (the num_tokens_message values). How can I retrieve this information?
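
Below is a minimal sketch of the approach from the linked cookbook recipe (its `num_tokens_from_messages` function). The overhead constants used here (3 tokens per message, 1 token for a `name` field, 3 tokens of reply priming) are the values the cookbook publishes for gpt-3.5-turbo/gpt-4-family models; they are not guaranteed for other or future models, and the fallback encoding name is an assumption on my part:

```python
import tiktoken

def num_tokens_from_messages(messages, model="gpt-4"):
    """Estimate the token count of a chat messages list, per the OpenAI Cookbook recipe.
    The per-message overheads below are the cookbook's published values for
    gpt-3.5-turbo / gpt-4-family models and may differ for other models."""
    try:
        encoding = tiktoken.encoding_for_model(model)
    except KeyError:
        # Assumption: fall back to cl100k_base if tiktoken does not know the model name.
        encoding = tiktoken.get_encoding("cl100k_base")

    tokens_per_message = 3  # overhead wrapping each message (role, separators)
    tokens_per_name = 1     # extra token when a "name" field is present

    num_tokens = 0
    for message in messages:
        num_tokens += tokens_per_message
        for key, value in message.items():
            num_tokens += len(encoding.encode(value))
            if key == "name":
                num_tokens += tokens_per_name
    num_tokens += 3  # every reply is primed with <|start|>assistant<|message|>
    return num_tokens

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "How many tokens is this prompt?"},
]
print(num_tokens_from_messages(messages))
```

To check the constants against a live model, you can compare this estimate with the `usage.prompt_tokens` value the API returns for the same messages; the difference tells you whether the per-message overhead has changed for that model.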