paul86
1
Hello, it's unclear to me what happens if I upload a file of, say, 100,000 words as specialist knowledge. What exactly do I have to pay for? Do I also pay for reading the tokens that correspond to those 100,000 words?
You will have to pay $0.20 per gigabyte of stored information per day, or part thereof, plus the normal API token charges for whatever the retrieval system places into the model's context.
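To put a number on the storage side: if "or part thereof" means the stored size is rounded up to a whole gigabyte (my reading of that phrase, not something the pricing page confirms), a 100,000-word file works out like this:

```python
# Storage-cost sketch. Assumptions: "or part thereof" rounds up to a whole
# gigabyte, ~6 bytes per English word, and a $0.20/GB/day rate.
import math

file_bytes = 100_000 * 6           # ~100K words of plain text, ~0.6 MB
GB = 1024 ** 3
billed_gb = max(1, math.ceil(file_bytes / GB))   # round up, minimum 1 GB
daily_storage_cost = billed_gb * 0.20            # USD per day

print(f"${daily_storage_cost:.2f}/day")  # $0.20/day
```

So the storage charge for a file this small would be dwarfed by the per-call token charges discussed below in the thread.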
_j
3
Let’s say you don’t attach any functions or the code interpreter, and you ask in a new thread, minimizing the autonomy the agent backend has to iterate on your behalf. The API documentation essentially promises that the model context will be filled with retrieval results (and there is no `max_tokens` to set).
How about 100K+ tokens of context loaded into gpt-4-turbo at $1.00 per call? Your 100,000 words come to around 150K tokens to be embedded, chunked, and placed back into the model context.
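The arithmetic behind those figures, as a quick sketch (the $0.01/1K input rate for gpt-4-turbo and the ~1.5 tokens-per-word ratio are assumptions from the time of this thread, not official constants):

```python
# Back-of-envelope token and cost estimate for the numbers above.
words = 100_000
est_tokens = int(words * 1.5)        # ~1.5 tokens per word (rough estimate)
price_per_1k_input = 0.01            # USD, assumed gpt-4-turbo input rate

# If retrieval loads ~100K of those tokens into context on each run:
context_tokens = 100_000
cost_per_run = context_tokens / 1000 * price_per_1k_input

print(est_tokens, f"${cost_per_run:.2f}")  # 150000 $1.00
```

Multiply that per-run figure by every message in a thread that triggers retrieval, and the costs add up quickly.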
You’ll only get a delayed daily report of the total tokens used for the day - by design, I suspect, since the obfuscated billing usage page arrived in concert with Assistants.
I’m rather upset that they removed the daily breakdown.
It would be extremely helpful to see what I used and when I used it. Instead we get an aesthetically pleasing but functionally useless line graph.