Token limit in chat session

I’m on a Plus subscription and using ChatGPT-4o with Canvas. Within a chat session there is no direct way to see the current token usage relative to the overall token limit, so users have no visibility into how many tokens have been consumed or how close they are to the cap, which makes longer conversations hard to manage. When the limit is reached, older parts of the conversation are silently truncated, potentially losing important context.
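
As a rough workaround, you can estimate usage yourself by counting tokens in your own copy of the conversation. Below is a minimal sketch using the open-source tiktoken library; the `gpt-4o` encoding lookup requires a recent tiktoken version, the 128k figure is the model's advertised context window (the effective in-app limit may be smaller), and the message list and per-message overhead handling are illustrative assumptions, not how ChatGPT tracks context internally.

```python
# Rough client-side estimate of conversation token usage with tiktoken.
import tiktoken

def count_tokens(messages, model="gpt-4o"):
    """Approximate the total tokens in a list of chat messages."""
    enc = tiktoken.encoding_for_model(model)
    # Sums token counts of the message bodies only; per-message overhead
    # (role markers, separators) is ignored, so the real total is a bit higher.
    return sum(len(enc.encode(m["content"])) for m in messages)

# Hypothetical conversation transcript pasted or exported from the chat.
conversation = [
    {"role": "user", "content": "Explain how context windows work."},
    {"role": "assistant", "content": "A context window is the maximum ..."},
]

used = count_tokens(conversation)
limit = 128_000  # assumed GPT-4o context window; the in-app limit may differ
print(f"~{used} tokens used of ~{limit} ({used / limit:.1%})")
```

This only approximates what the app itself counts, but it at least gives a sense of when a long conversation is approaching the point where older messages will start to be dropped.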