I want the AI to return the best answer from a list of pre-made answers stored in a .txt file, so I am using file_search to give the AI access to this file. The results are great, but it seems the .txt file is being provided in each API call, which consumes a lot of tokens.
Is there any way for the AI to simply reference the file that is already uploaded to OpenAI, rather than feeding it into the input on every API call? Are there any other alternatives I didn’t consider?
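For context, this is roughly the setup (a minimal sketch assuming the Responses API and a recent Python SDK where vector stores are no longer under `client.beta`; the file name and model are placeholders):

```python
from openai import OpenAI

client = OpenAI()

# Upload the answers file once and index it in a vector store.
answers_file = client.files.create(
    file=open("answers.txt", "rb"),
    purpose="assistants",
)
store = client.vector_stores.create(
    name="canned-answers",
    file_ids=[answers_file.id],
)

# Every query still retrieves chunks from the store and injects them
# into the prompt, which is where the per-call token cost comes from.
response = client.responses.create(
    model="gpt-4o-mini",
    input="Which pre-made answer best fits this question: ...?",
    tools=[{"type": "file_search", "vector_store_ids": [store.id]}],
)
print(response.output_text)
```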
I’m not really sure what this means. You have a document that contains the answers to expected questions? Why not use Function Calling for this?
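With Function Calling, the answers file stays on your side and never enters the prompt; the model only sees the single answer your code looks up. A hedged sketch (the `lookup_answer` helper, the blank-line-separated file format, and the model name are all assumptions, not your actual setup):

```python
import json
from openai import OpenAI

client = OpenAI()

# Local lookup over the pre-made answers -- the file itself is never
# sent to the model, only the one answer this function returns.
def lookup_answer(topic: str) -> str:
    with open("answers.txt") as f:
        answers = f.read().split("\n\n")  # assumes blank-line-separated answers
    matches = [a for a in answers if topic.lower() in a.lower()]
    return matches[0] if matches else "No matching pre-made answer."

tools = [{
    "type": "function",
    "function": {
        "name": "lookup_answer",
        "description": "Fetch the pre-made answer for a topic.",
        "parameters": {
            "type": "object",
            "properties": {"topic": {"type": "string"}},
            "required": ["topic"],
        },
    },
}]

messages = [{"role": "user", "content": "How do I reset my password?"}]
first = client.chat.completions.create(
    model="gpt-4o-mini", messages=messages, tools=tools
)

# A real implementation would check whether the model actually called the tool.
call = first.choices[0].message.tool_calls[0]
args = json.loads(call.function.arguments)

# Feed the looked-up answer back so the model can phrase the final reply.
messages.append(first.choices[0].message)
messages.append({
    "role": "tool",
    "tool_call_id": call.id,
    "content": lookup_answer(**args),
})
final = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
print(final.choices[0].message.content)
```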
How else is the model going to know? Although the low-level details of file_search aren’t public, it’s (IMO) safe to assume that it either injects the full document on each query or injects the top-matching chunks for the model to judge for relevance.
So if you are asking questions against a document of answers, it makes sense that the model needs to parse through a lot of it to determine whether the answer is there.
This is what they are aiming for: results over costs. Can you optimize the costs? For sure, but that requires some additional insight into your system.
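For instance, one way to cut the per-call token cost is to do the retrieval yourself: embed the answers once, match each query locally, and send only the single best candidate to the model. A sketch under assumptions (blank-line-separated answers, `text-embedding-3-small`, in-memory vectors; a real system would cache the embeddings to disk):

```python
from openai import OpenAI

client = OpenAI()

with open("answers.txt") as f:
    answers = f.read().split("\n\n")  # assumes blank-line-separated answers

# One-time cost: embed every pre-made answer and keep the vectors.
answer_vecs = [
    d.embedding
    for d in client.embeddings.create(
        model="text-embedding-3-small", input=answers
    ).data
]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb)

def best_answer(question: str) -> str:
    q = client.embeddings.create(
        model="text-embedding-3-small", input=question
    ).data[0].embedding
    return max(zip(answer_vecs, answers), key=lambda p: cosine(q, p[0]))[1]

# Per query, only the one selected answer is spent as prompt tokens.
question = "How do I reset my password?"
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system",
         "content": "Reply using only this pre-made answer:\n" + best_answer(question)},
        {"role": "user", "content": question},
    ],
)
print(response.choices[0].message.content)
```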