Feature Request - See current window usage

Hello! As I use ChatGPT and GPT-4 for projects, I find myself worrying about the model forgetting previously stated information. I think a visual indicator or warning of some sort could be really beneficial to inform the user that the model is starting to forget what was stated earlier.
So two features:

  • Show current context window usage, e.g. 5000/8000 tokens in window.
  • If the conversation exceeds the maximum window size, show visually in the browser where the cut-off point is, so the user can work around it effectively.

Hi @kimG,

one way to get this information would be to use the API; there the token usage is included in the reply.

As you can see in the screenshot, you'll receive a detailed overview of token usage as well as the stop reason. So if you run out of tokens, I believe the stop reason will also give you the information you're looking for.
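To make the point above concrete, here is a minimal sketch of what those fields look like in a Chat Completions API response. The structure (`usage` with token counts, `finish_reason` per choice) follows the documented API response format, but the values shown are made up for illustration:

```python
import json

# A trimmed example of a Chat Completions API response body.
# The field names follow the API's documented shape; the values are invented.
sample_response = json.loads("""
{
  "choices": [
    {"finish_reason": "length",
     "message": {"role": "assistant", "content": "..."}}
  ],
  "usage": {"prompt_tokens": 5000,
            "completion_tokens": 3000,
            "total_tokens": 8000}
}
""")

usage = sample_response["usage"]
print(f"prompt: {usage['prompt_tokens']} tokens, "
      f"total: {usage['total_tokens']} tokens")

# finish_reason == "length" means the reply was cut off by the token limit,
# as opposed to "stop", which means the model finished naturally.
if sample_response["choices"][0]["finish_reason"] == "length":
    print("Response was truncated: token limit reached")
```

So checking `finish_reason` against `"length"` is the programmatic equivalent of the warning you're asking for in the UI.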

Thank you for your answer!
Am I wrong to assume that this is an API-only feature, and that when using the browser chat version you cannot get this information?
Or is it possible to get the ID of the current chat session and see this?


Hi @kimG, you're right, this is only for the API. If you want to see this in the chat window, however, I have an idea for you :wink:

But this is only an idea and won't match the OpenAI tokenizer exactly - so maybe just tell it to count the number of characters and then divide by the average characters per token.
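The character-count heuristic above can be sketched in a few lines. The ~4 characters per token figure is a commonly cited rule of thumb for English text and is an assumption here, not an exact value; only a real tokenizer (e.g. the `tiktoken` library) gives exact counts:

```python
def estimate_tokens(text: str, chars_per_token: float = 4.0) -> int:
    """Rough token estimate from character count.

    English text averages roughly 4 characters per token (an assumed
    heuristic); this will NOT match OpenAI's tokenizer exactly.
    """
    return round(len(text) / chars_per_token)

prompt = "Hello! As I am using ChatGPT for projects I have found myself..."
print(f"~{estimate_tokens(prompt)} tokens (rough estimate)")
```

For an exact count you'd tokenize with the model's actual encoding instead, but for a quick "am I near the window limit?" check a heuristic like this is often good enough.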

Another option would be to look at the fetch requests in your browser's developer tools - the number of tokens might be stated somewhere in there as well… but I'm not sure about this one.


Oh, I thought you were looking for API answers because you posted in the "General API discussion" thread. If this is regarding ChatGPT with the GUI, the "Prompt Assistance" or "ChatGPT" category might lead to better results for your inquiry :innocent:
