If you are discussing providing a service to users: Tier 4, which gives a rate limit of two million tokens per minute, requires having paid OpenAI $250 in total, with 14+ days elapsed since your first successful payment at the time you make that latest prepayment. To put that in perspective, OpenAI can collect $200 per month from a single ChatGPT user.
But you are correct: using a retrieval database or search service to give the AI only what is relevant to the user's input is far more cost-efficient than "read this whole book every time, just to possibly answer one question from it".
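A minimal sketch of that retrieval step, assuming the current OpenAI Python SDK and the `text-embedding-3-small` model (swap in whatever embedding model and vector store you actually use): embed the document chunks once, embed the question at request time, and pass only the top-scoring chunks into the prompt.

```python
# Retrieval sketch: embed chunks once, then pick the most relevant ones
# per question instead of sending the whole document every time.
# The chunk texts and the embedding model are placeholders, not your data.
import numpy as np
from openai import OpenAI

client = OpenAI()

def embed(texts: list[str]) -> np.ndarray:
    """Return one embedding vector per input string."""
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([d.embedding for d in resp.data])

# One-time indexing of your "book", already split into chunks elsewhere.
chunks = ["chapter 1 text ...", "chapter 2 text ...", "appendix ..."]
chunk_vectors = embed(chunks)

def top_chunks(question: str, k: int = 3) -> list[str]:
    """Rank chunks by cosine similarity to the question; return the best k."""
    q = embed([question])[0]
    scores = chunk_vectors @ q / (
        np.linalg.norm(chunk_vectors, axis=1) * np.linalg.norm(q)
    )
    return [chunks[i] for i in np.argsort(scores)[::-1][:k]]

# Only the relevant chunks go into the prompt, not the whole book.
context = "\n\n".join(top_chunks("What does the appendix say about limits?"))
```

In production you would cache the chunk embeddings in a vector database rather than recomputing them, but the cost argument is the same: you pay per relevant chunk, not per book.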
Semantic search doesn't work well on raw tabular data, such as Excel statistics exported to CSV, so you might need more advanced tooling that gives the AI the ability to write its own queries against the data.
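One way to wire that up, sketched under assumptions (the CSV loaded into a local SQLite table named `stats`, a hypothetical `run_sql` tool, and `gpt-4o-mini` as the model): expose a query tool via function calling, let the model write SQL against the table schema, run it, and feed the rows back so the model can answer from them.

```python
# Sketch: let the model write its own SQL over tabular data via tool calling.
# The table name, columns, tool name, and model are illustrative assumptions.
import json
import sqlite3
from openai import OpenAI

client = OpenAI()
db = sqlite3.connect("stats.db")  # CSV imported into SQLite beforehand

tools = [{
    "type": "function",
    "function": {
        "name": "run_sql",
        "description": "Run a read-only SQL query against the 'stats' table "
                       "(columns: region TEXT, month TEXT, revenue REAL).",
        "parameters": {
            "type": "object",
            "properties": {"query": {"type": "string"}},
            "required": ["query"],
        },
    },
}]

messages = [{"role": "user",
             "content": "Which region had the highest revenue in March?"}]
response = client.chat.completions.create(
    model="gpt-4o-mini", messages=messages, tools=tools
)
msg = response.choices[0].message

if msg.tool_calls:
    messages.append(msg)
    for call in msg.tool_calls:
        sql = json.loads(call.function.arguments)["query"]
        rows = db.execute(sql).fetchall()  # validate/sandbox this in practice
        messages.append({
            "role": "tool",
            "tool_call_id": call.id,
            "content": json.dumps(rows),
        })
    # Second call lets the model turn the query result into a plain answer.
    final = client.chat.completions.create(
        model="gpt-4o-mini", messages=messages, tools=tools
    )
    print(final.choices[0].message.content)
```

Only the query result goes back to the model, so a million-row spreadsheet costs you a handful of tokens per question instead of the whole file.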