I’m currently working on code that reads a substantial file serving as my knowledge base. The objective is to use GPT to answer user questions based on this information. However, I have encountered a token limit error. Is it possible to purchase additional tokens so my program can process a larger volume of text without hitting this limit?
By token limit error, do you mean the 8k context window length?
If so, sadly only the 8k context window is generally available right now, with a few people having access to the 32k one. There is no way to get the bigger one or to buy additional tokens. However, there might be ways around this, depending on how big your document is and whether you are willing to preprocess it before using it as a knowledge base.
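One common preprocessing approach is to split the knowledge base into chunks and send the model only the chunks most relevant to the question, so the prompt stays within the context window. Below is a minimal, hypothetical sketch of that idea using simple word-overlap scoring; the chunk size, scoring function, and prompt format are all illustrative assumptions (a real setup would typically use embeddings for retrieval).

```python
def chunk_text(text, max_words=200):
    """Split the knowledge base into fixed-size word chunks (size is an assumption)."""
    words = text.split()
    return [" ".join(words[i:i + max_words]) for i in range(0, len(words), max_words)]

def score(chunk, question):
    """Crude relevance score: count of words shared between chunk and question."""
    q = set(question.lower().split())
    c = set(chunk.lower().split())
    return len(q & c)

def build_prompt(kb_text, question, top_k=3):
    """Keep only the top_k most relevant chunks so the prompt fits the context window."""
    chunks = chunk_text(kb_text)
    best = sorted(chunks, key=lambda ch: score(ch, question), reverse=True)[:top_k]
    context = "\n---\n".join(best)
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"
```

The resulting prompt string is what you would pass to the API in place of the full document; tuning `max_words` and `top_k` trades answer quality against token usage.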