Hey @odb23, welcome to the forum. This thread may be helpful for you: "Ways to automate breaking a large piece of input into chunks that fit in the 4096 token constraint?"
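For anyone landing here, the usual approach in that thread boils down to splitting the input into pieces that each stay under the model's token budget. Here's a minimal sketch using a rough words-to-tokens heuristic (the ~1.3 tokens-per-word ratio is an assumption; a real tokenizer such as OpenAI's `tiktoken` gives exact counts):

```python
def chunk_text(text: str, max_tokens: int = 4096, tokens_per_word: float = 1.3) -> list[str]:
    """Split text into chunks whose *estimated* token count fits max_tokens.

    Assumes ~1.3 tokens per whitespace-separated word, which is only a
    heuristic; for exact counts, encode with a real tokenizer (e.g. tiktoken)
    and slice the token list instead.
    """
    max_words = int(max_tokens / tokens_per_word)
    words = text.split()
    # Greedily take max_words words per chunk.
    return [" ".join(words[i:i + max_words]) for i in range(0, len(words), max_words)]

# Example: 5000 words won't fit one 4096-token chunk, so it gets split.
chunks = chunk_text("word " * 5000)
```

Each chunk can then be sent as its own request (or fed through a map-reduce style summarisation), which is the pattern the linked thread discusses.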
lachie1