Hello OpenAI Team,
I am writing to propose an enhancement to the ChatGPT API's memory capacity. Currently, memory is capped at 8K tokens. As a user who works with long conversations and ongoing projects, I believe raising the limit to 24K tokens would significantly improve the user experience.
Here are some reasons I think this would be beneficial:
- Extended Context for Complex Conversations: Increasing the token limit would allow for better context retention during longer discussions or complex queries, helping users maintain continuity in their interactions.
- Improved Productivity for Long-Term Projects: For users who rely on ChatGPT to write stories, conduct research, or manage multi-step workflows, having more tokens available would help ensure that important context is not lost during long interactions.
- Better Handling of Multiple Messages: With more tokens, ChatGPT could retain a more comprehensive history of past exchanges, resulting in more accurate and relevant responses over time (the short sketch after this list illustrates the trimming a tighter cap forces today).
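To make the pain point concrete, here is a minimal, hypothetical sketch of the kind of trimming users have to do today to stay under a fixed memory budget. The budget constant, the rough characters-per-token estimate, and the function names are illustrative assumptions rather than documented behavior; the point is simply that older messages get dropped, which is exactly the context loss described above.

```python
# Hypothetical sketch: trimming chat history to fit a fixed memory budget.
# The 8K-token cap and the ~4 characters-per-token estimate are illustrative
# assumptions, not documented values.

MEMORY_BUDGET_TOKENS = 8_000  # assumed current cap


def estimate_tokens(text: str) -> int:
    """Rough heuristic: about 4 characters per token for English text."""
    return max(1, len(text) // 4)


def trim_history(messages: list[dict]) -> list[dict]:
    """Keep only the most recent messages that fit under the budget.

    Older messages are dropped entirely, which is the context loss
    a larger memory limit would help avoid.
    """
    kept, total = [], 0
    for message in reversed(messages):  # walk newest to oldest
        cost = estimate_tokens(message["content"])
        if total + cost > MEMORY_BUDGET_TOKENS:
            break
        kept.append(message)
        total += cost
    return list(reversed(kept))  # restore chronological order
```

With a 24K budget, roughly three times as much of that history could be kept verbatim before anything has to be discarded.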
I understand that increasing the token limit may involve trade-offs in performance and resource usage, but I believe this change would enhance the utility of ChatGPT, especially for users who rely on the platform for in-depth or ongoing tasks.
Thank you for considering my suggestion, and I look forward to your feedback!