Token Limit Awareness in ChatGPT Conversations

Written with GPT because my English is kind of bad:

One major concern is that users might assume GPT “knows” the entire context of the conversation, while, in reality, some previous information might have been cut off due to exceeding the token limit. This could lead to misunderstandings, as users might refer to past interactions that GPT cannot access, resulting in incomplete or even inaccurate responses. Being aware of the token limit can help mitigate this risk, ensuring a smoother and more coherent exchange of information with the AI.
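To illustrate what "cut off" can look like in practice, here is a minimal sketch of sliding-window truncation. It assumes a client counts tokens with the tiktoken library and drops the oldest messages once a placeholder budget is exceeded; the 8192-token figure and the logic itself are illustrative assumptions, not how ChatGPT actually manages its context:

```python
# Illustrative sketch only: simple sliding-window truncation.
# The 8192-token budget is a placeholder, not ChatGPT's real limit,
# and this is not OpenAI's actual context-management logic.
import tiktoken

encoding = tiktoken.encoding_for_model("gpt-4")

def count_tokens(message: dict) -> int:
    """Rough token count for one chat message (ignores per-message overhead)."""
    return len(encoding.encode(message["content"]))

def truncate_history(messages: list[dict], budget: int = 8192) -> list[dict]:
    """Keep only the most recent messages that fit inside the token budget."""
    kept, used = [], 0
    for message in reversed(messages):      # walk newest to oldest
        tokens = count_tokens(message)
        if used + tokens > budget:
            break                           # older messages are silently dropped
        kept.append(message)
        used += tokens
    return list(reversed(kept))             # restore chronological order
```

The point of the sketch is the last comment: everything older than the budget simply disappears from what the model can see, without the user being told.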

Suggestion:

Informing Users about the Token Limit: When initiating a conversation with ChatGPT, it would be helpful if the AI could provide a brief message informing users about the token limit. This could be displayed at the beginning of the chat or as a small indicator alongside the text input area. This way, users would be more mindful of the constraint and could structure their queries accordingly.

Token Limit Awareness within the Chat Interface: Additionally, it would be beneficial if ChatGPT could detect when a conversation is approaching the token limit. For example, when the conversation nears 80%, 90%, or 95% of the token limit, ChatGPT could display a gentle warning or highlight the text area to indicate the proximity to the limit. This visual cue would enable users to adjust their questions or responses proactively and avoid unexpected truncations.
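As a rough sketch of how such a warning could be triggered: the 80%, 90%, and 95% thresholds mirror the suggestion above, while the token limit, the tiktoken-based counting, and the function itself are hypothetical assumptions, not a description of ChatGPT's actual behaviour:

```python
# Illustrative sketch of threshold warnings; the limit and counting method
# are assumptions, not ChatGPT's actual implementation.
import tiktoken

encoding = tiktoken.encoding_for_model("gpt-4")
TOKEN_LIMIT = 8192                  # placeholder context size
THRESHOLDS = (0.80, 0.90, 0.95)     # warn at 80%, 90%, and 95% usage

def usage_warning(messages: list[dict], limit: int = TOKEN_LIMIT) -> str | None:
    """Return a warning string once usage crosses one of the thresholds."""
    used = sum(len(encoding.encode(m["content"])) for m in messages)
    ratio = used / limit
    if not any(ratio >= t for t in THRESHOLDS):
        return None
    return (f"Conversation is at {ratio:.0%} of the {limit}-token limit; "
            f"older messages may soon be dropped.")
```

A chat UI could call something like usage_warning() after every turn and show the returned message next to the text input area, which is exactly the kind of gentle cue described above.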

Thanks, Victor. OpenAI, hire me please.
