If you also think that this pricing is a little high, please reply to me and support me in contacting GPT-4 support.
Thank You
I personally don’t believe the cost of GPT-4 is too high. It feels like they lower the prices all the time. Also, the preview turbo model not only has a larger context window of 128,000 tokens, it is also 3x cheaper. I would recommend using GPT-4-Turbo if you are not already; it will save you a lot of money if cost is your problem.
You’ve got to be careful with the context.
Sending 100,000 tokens of context for a short 2,000-token reply is easily $1 per message.
And that can get expensive really fast.
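As a rough sanity check, the per-message cost is just input tokens times the input rate plus output tokens times the output rate. Here is a minimal sketch, assuming GPT-4-Turbo's list prices at the time ($0.01 per 1K input tokens, $0.03 per 1K output tokens); plug in whatever rates apply to the model you are actually using:

```python
# Rough per-message cost estimate. The rates below are assumptions
# (GPT-4-Turbo list pricing: $0.01/1K input, $0.03/1K output);
# substitute the current prices for your model.
INPUT_RATE_PER_1K = 0.01
OUTPUT_RATE_PER_1K = 0.03

def message_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimated cost in dollars for a single request."""
    return (input_tokens / 1000) * INPUT_RATE_PER_1K + \
           (output_tokens / 1000) * OUTPUT_RATE_PER_1K

# 100K tokens of context for a 2K-token reply is already over a dollar.
print(f"${message_cost(100_000, 2_000):.2f}")  # -> $1.06
```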
‘Keep it short’ is a candidate to become one of the 10 commandments for developing with LLMs.
If the Assistant costs $8 per 1M tokens, it only takes 20-30 questions to reach 1M tokens. This is crazy.
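The reason a thread burns through tokens that fast is that every new question resends the whole conversation, so billed input tokens grow roughly quadratically with the number of turns. A minimal sketch of that growth (the ~3,000 tokens per exchange is an illustrative assumption; your own prompts and retrieval will differ):

```python
# Illustrative sketch: if each exchange adds ~3,000 tokens and the full
# history is resent on every turn, billed tokens grow quadratically.
TOKENS_PER_EXCHANGE = 3_000  # assumption, varies with prompts/retrieval

total_billed = 0
for turn in range(1, 31):
    context = turn * TOKENS_PER_EXCHANGE  # whole history resent each turn
    total_billed += context
    if total_billed >= 1_000_000:
        print(f"~1M billed tokens after {turn} questions")
        break
```

With these numbers the loop crosses 1M billed tokens at around 26 questions, which lines up with the 20-30 figure above.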