It's depressing that the only option to replace fine-tuned Babbage is 3x more expensive.

Having created some actually useful AI agents, we found we needed to run multiple prompts behind the scenes and stitch them together to create a useful answer.

If you layer even a handful of GPT-3.5 and GPT-4 prompts together, you can quickly end up in a world where it's cheaper to have a person do it. I was hoping the November price adjustments would solve this, but given the cost impact, we'll likely have to go the open-source route and try out some Hugging Face models. It's just annoying when we'd like to stay focused on building the product.

Anyway, my 2c. I get that we might be a niche user.

Your topic title doesn't align with the contents.

It would appear to me that fine-tuning hasn't changed much in pricing; I'd expect similar quality when paying a similar price, even though the lineup went from 4 models to 2.

If the new, cheaper babbage-002 can't replace the old babbage, you do get a big cost increase jumping to davinci-002…

Fine-tune model pricing, per million tokens:

small models:

| Legacy model | Training price | Usage price | Recommended replacement |
| --- | --- | --- | --- |
| ada | $0.4 / 1M tokens | $1.6 / 1M tokens | babbage-002 |
| babbage | $0.6 / 1M tokens | $2.4 / 1M tokens | babbage-002 |

| Replacement | Training | Input usage | Output usage |
| --- | --- | --- | --- |
| babbage-002 | $0.4 / 1M tokens | $1.6 / 1M tokens | $1.6 / 1M tokens |

bigger model:

| Legacy model | Training price | Usage price | Recommended replacement |
| --- | --- | --- | --- |
| curie | $3 / 1M tokens | $12 / 1M tokens | davinci-002 |

| Model | Training | Input usage | Output usage |
| --- | --- | --- | --- |
| davinci-002 | $6 / 1M tokens | $12 / 1M tokens | $12 / 1M tokens |
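To put a number on that jump, here's a quick back-of-the-envelope sketch (usage prices are copied from the tables above; the 10M-tokens-per-month volume is made up purely for illustration):

```python
# Usage price in dollars per 1M tokens, from the pricing tables above.
# For babbage-002 and davinci-002, input and output are priced the same here.
PRICES = {
    "babbage (legacy)": 2.4,
    "babbage-002": 1.6,
    "davinci-002": 12.0,
}

def usage_cost(model: str, millions_of_tokens: float) -> float:
    """Dollar usage cost for a given token volume (in millions)."""
    return PRICES[model] * millions_of_tokens

# Hypothetical 10M tokens/month:
print(usage_cost("babbage (legacy)", 10))  # 24.0
print(usage_cost("babbage-002", 10))       # 16.0 -- cheaper, if it works for you
print(usage_cost("davinci-002", 10))       # 120.0 -- 7.5x babbage-002
```

So if fine-tuned babbage-002 matches your old babbage quality, usage actually gets cheaper; it's only when you're forced up to davinci-002 that the bill multiplies.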

If you had fine-tuned within a certain window before the deprecation announcement, you may have had a significant chunk of your past fine-tuning costs credited.

But you are correct, an untrained small model is not a scarce resource any more, and you can tune by the hour.

Ah, if fine-tuned babbage-002 is sticking around, then I got the wrong end of the stick. I keep getting emails saying models we use are going to be deprecated, and I thought fine-tuned babbage-002 was the only candidate for that. I guess some other part of our product must be using a different model.

I might delete this post, as it seems it might be misleading. Thank you.

Actually, this is probably a good post to leave up, as others might make the same or similar mistakes with the upcoming deprecations. It basically shows that one should not jump to conclusions; a little fact-finding may go a long way.


Mind if I close this topic so that others can't pile on?

Users really should start new questions as new topics, and this topic could easily become a dumping ground for similar complaints. Logan prefers not to have megathreads, since that makes it easier to deal with problems.


Do what you like; I don't seem to have a delete option anyway.

Thanks for responding.

Only moderators and above have the close or delete option.