I have a use case with a database of books, each tagged with 10 words. I put together a demo where, per API request to GPT-4 Turbo, I:
Step 1: Upload a JSON-ified version of the entire SQL DB of 300 books and their 10 tags each.
Step 2: Ask for a JSON-style response listing the top 3 similar books.
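For context, the per-request flow looks roughly like this (a minimal sketch; the `books` list is a toy stand-in for my SQL export, and the commented-out call assumes the official `openai` Python SDK):

```python
import json

# Toy stand-in for the SQL export: 300 rows of (title, 10 tags).
books = [
    {"title": f"Book {i}", "tags": [f"tag{i}-{j}" for j in range(10)]}
    for i in range(300)
]

def build_prompt(target_title, books):
    """Build the prompt sent on EVERY request: the entire catalog
    plus the similarity question. Resending the catalog each time
    is what blows up the token count."""
    catalog = json.dumps(books)
    return (
        f"Here is my book catalog as JSON:\n{catalog}\n\n"
        f"Return a JSON list of the top 3 books most similar to {target_title!r}, "
        "judged by tag overlap."
    )

prompt = build_prompt("Book 0", books)

# The actual call (omitted here since it needs an API key):
# client = openai.OpenAI()
# response = client.chat.completions.create(
#     model="gpt-4-turbo",
#     messages=[{"role": "user", "content": prompt}],
#     response_format={"type": "json_object"},
# )

# Rough token estimate (~4 characters per token): the whole catalog
# rides along on every single request.
approx_tokens = len(prompt) // 4
```

Even with this toy data, the prompt is thousands of tokens per request, which is why the allowance runs out so fast.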
As you can imagine, the token cost for this is huge! After about 10 requests I run out of my daily token allowance.
If I upload the file instead, would that save on tokens? Any advice on how to make this reliable enough to use more than 10 times a day?
Thanks in advance!