Hi guys, while ChatGPT is free and there are multiple projects that leverage it, the API prices remain the same, and using Davinci 003 is very costly, which doesn’t make any sense. Are there any plans to reduce the gap between the two? Otherwise we should start looking into how to embed ChatGPT instead of paying for the API. Thanks!
I don’t think unofficial use of ChatGPT will always be available, and going that route could cost you your account under the Terms of Service. I doubt ChatGPT will remain free forever.
Davinci dropped in price from $0.06/1,000 tokens to $0.02/1,000 tokens, so I imagine the other models will drop in price as new ones emerge.
Hope this helps.
Maybe, but if the Microsoft-OpenAI investment happens, who knows? WhatsApp was supposed to cost money eventually, but Facebook decided to keep it free for good.
Any kind of chat-based product requires tons of tokens because you need to keep the chat log in the prompt, which makes Davinci unrealistic even at $0.02 per 1,000 tokens.
At the same time, it is available for free to everyone through ChatGPT.
That’s kind of killing the motivation to work with the APIs.
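A rough sketch of why the chat log drives cost up so fast: if every request resends the full history, cumulative billed tokens grow roughly quadratically with the number of turns. The per-turn token count below is a made-up illustrative figure, not a measurement.

```python
# Illustration: resending the full chat log each turn makes cumulative
# billed tokens grow quadratically with the number of turns.
PRICE_PER_1K = 0.02     # Davinci price, USD per 1,000 tokens
TOKENS_PER_TURN = 150   # assumed tokens per user message + reply (made up)

def chat_cost(turns: int) -> float:
    """Total cost when every request includes all prior turns as prompt."""
    total_tokens = 0
    for turn in range(1, turns + 1):
        # this request's billed tokens = entire history so far
        total_tokens += turn * TOKENS_PER_TURN
    return total_tokens / 1000 * PRICE_PER_1K

print(f"10-turn chat: ${chat_cost(10):.3f}")
print(f"50-turn chat: ${chat_cost(50):.3f}")
```

Note that a 5x longer chat costs far more than 5x as much, which is the core of the complaint above.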
Personally I like the business model.
If OpenAI charges for the API usage, then whoever builds an app on top of it must have a monetization method too.
If API usage were cheap, or even free, thousands of junk websites/apps would spread all over, making the internet chaotic.
In short, the business model makes service usage more responsible.
Same thing. The API is still kind of expensive. You’d think that 2 cents per 1,000 tokens is much, but as soon as you start playing around, you rack up $20 pretty quickly, which where I live can buy a good, big pizza with eight slices.
I understand that such a model is GPU intensive and requires a lot of horsepower, but the API should remain cheap until you request a review for a commercial application. $0.02/1k is not cheap.
In the era of social networks, people are used to large quantities of low-quality information because it is free.
ChatGPT supplies existing information, but it spares users the junk part.
I spent 20 minutes today with ChatGPT and I chose it instead of Google Search.
Google can provide websites for you, but then you have to dodge the ads and the meaningless content.
For some people it is worth spending cash to gain time.
I am talking about merely prototyping applications, not released products.
2 cents per 1k token just to play around gets expensive quickly.
That’s not really relevant. The potential usage of GPT-3.5 is far wider than online search (which will also be handled).
However, ChatGPT is currently disrupting the emerging ecosystem that was built on top of OpenAI’s APIs.
You are right. This could be handled somehow by OpenAI. There could be a sandbox for development purposes.
I spent $20 just to test the fine-tuning feature.
That’s true, but honestly it was predictable.
There are websites that generate articles for you… That is too simple to do. People can simply go to the OpenAI Playground and do the same.
Most certainly we will see more surprises in the next years.
The apps with the best chances are probably those that can add real value on top of the generative AI.
I totally agree on the need to create unique products, but I think that providing a free B2C service while overcharging for the API, and then competing with your partners, wasn’t really predictable.
Here are some valuable lessons:
- Never depend on someone else’s goodwill (or at least have alternatives available).
- Interfacing generic technology is not a product, even if it’s very cool (e.g. text summarization or creative content using a language model); make your product unique for the long term.
- Keep an eye on the market; it is constantly changing.
- While new technologies are fantastic, they also present unique challenges. Always stay one step ahead.
- Put yourself in a lucky position and move FAST.
I think the media hype over ChatGPT was totally unexpected. And now that the hype train has left the station, OpenAI has to keep ChatGPT operational (which is likely causing all the recent outages). The news I read today is that they are trying to field a paid version called ChatGPT Pro. If this takes off maybe they can buy more servers.
The more common solution would be to add OAuth authentication for your OpenAI account. And that way, your OpenAI bill would be separated from whether or not the app or integration you’re using is free.
If we assume that it costs OpenAI $0.02 per query to ChatGPT (they said it costs a couple of cents), and we assume a million interactions per day… Which I think is likely on the lower end of things, then we’d be talking about $20K / day to run ChatGPT.
Although, given that it got a million users in the first 5 days, and an average session usually isn’t just one interaction, by this point they could have tens of millions of interactions per day.
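The back-of-the-envelope estimate above can be written out directly; the $0.02-per-query figure and the daily volumes are the assumptions from these posts, not official numbers.

```python
# Back-of-the-envelope ChatGPT running-cost estimate.
COST_PER_QUERY = 0.02  # assumed: "a couple of cents" per query

for queries_per_day in (1_000_000, 10_000_000, 50_000_000):
    daily = queries_per_day * COST_PER_QUERY
    print(f"{queries_per_day:>11,} queries/day -> "
          f"${daily:>12,.0f}/day (~${daily * 30:,.0f}/month)")
```

At a million queries a day this gives the $20K/day figure mentioned above; at tens of millions of interactions it climbs toward a million dollars a day.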
ChatGPT is a research preview to gather data. It can’t really be compared to an actual product.
And as much as I want to build apps on top of ChatGPT myself, the use of the unofficial API could be killed at any point, by any kind of change at all to that API.
Even if OpenAI isn’t trying to kill the integrations (some of which are quite cool), they could end up killing them completely by accident. Because the API is only meant to be used internally by OpenAI, and thus there’s no support for it, nor guarantee that it will remain the same.
Do you mean “isn’t much” or “is much”?
ChatGPT is a research project, and all the data collected will be used to deliver better OpenAI solutions in the future. Some will likely be available on the API model - others might be integrated in solutions like GitHub, or stand-alone products.
Anyone developing apps on ChatGPT today is providing OpenAI with a view on what services will succeed or fail in the market - and maybe why.
As such, keeping ChatGPT free for a while is a fantastic investment. Let others take the risk…
Is paying not even $10 for the equivalent of a 1000 page book expensive for you?
For every time you read a single page, yes.
$0.02 per 1000 tokens.
What is 1 token worth in terms of the output you get from ChatGPT? Does 1 token equate to each prompt/search, or is it more in alignment with the number of characters in a ChatGPT reply?
Those 1,000 tokens include the tokens of both the prompt and the completion. So you pay for the tokens in your prompts as well as in the completions.
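In other words, billed tokens = prompt tokens + completion tokens. A tiny sketch of the arithmetic (the token counts here are made up for illustration; in practice you would count them with a tokenizer such as `tiktoken`):

```python
PRICE_PER_1K = 0.02  # Davinci price, USD per 1,000 tokens

def request_cost(prompt_tokens: int, completion_tokens: int) -> float:
    """Both sides of the exchange count toward the bill."""
    return (prompt_tokens + completion_tokens) / 1000 * PRICE_PER_1K

# e.g. a 250-token prompt that yields a 750-token completion:
print(f"${request_cost(250, 750):.2f}")  # 1,000 tokens total -> $0.02
```

A token is neither a whole prompt nor a single character; it is a chunk of text, roughly four characters or three-quarters of a word of English on average.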