Are there any tips for fixing the issue of GPT-4o (and probably other models as well) ignoring information in very long messages (15-20k tokens)?
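One common workaround (my own assumption, not something confirmed for GPT-4o specifically) is to avoid sending one huge 15-20k token message at all: split the long input into smaller chunks, query the model once per chunk, and then combine the answers. A minimal sketch of the chunking step, using a simple whitespace word count as a rough stand-in for tokens (the hypothetical `chunk_text` helper and the `max_words` parameter are illustrative, not from any API):

```python
def chunk_text(text: str, max_words: int = 1000) -> list[str]:
    """Split `text` into chunks of at most `max_words` whitespace-separated
    words, as a crude proxy for token count. Each chunk would then be sent
    to the model in its own request, and the per-chunk answers combined."""
    words = text.split()
    return [
        " ".join(words[i : i + max_words])
        for i in range(0, len(words), max_words)
    ]


# Example: a 2500-word document becomes three chunks (1000, 1000, 500 words).
chunks = chunk_text(" ".join(["word"] * 2500), max_words=1000)
```

For more accurate chunk sizes you could count real tokens with a tokenizer such as `tiktoken` instead of words, but the splitting logic stays the same.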