No token limit per fine-tuning job? It was 50 million before

It seems the documentation was changed. I guess you have to find out for yourself if a $400 fine-tune job will go through successfully.

Fine-tuning of a model can also now be continued, adding to its weights or to the diversity of its training data, so that gives you another way to extend the effective number of epochs over your training set.
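
For anyone wanting to try continuing a fine-tune, here's a minimal sketch using the OpenAI Python SDK; the training file name and the fine-tuned model ID are placeholders, so substitute your own:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Upload the next batch of training data (file name is a placeholder).
training_file = client.files.create(
    file=open("next_batch.jsonl", "rb"),
    purpose="fine-tune",
)

# Start a new fine-tuning job using a previously fine-tuned model as the base.
# "ft:gpt-3.5-turbo:my-org:first-run:abc123" is a placeholder model ID.
job = client.fine_tuning.jobs.create(
    training_file=training_file.id,
    model="ft:gpt-3.5-turbo:my-org:first-run:abc123",
)
print(job.id, job.status)
```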

I think the fact that you can continue previously fine-tuned models implies any training-token limit is academic, since you could just train in 50-million-token (or whatever the limit is) batches and treat them as “checkpoints.”

In fact, aside from the extra effort required, I can imagine it might make sense under some circumstances to train only one epoch at a time.
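
As a rough illustration of that checkpoint idea, here's a hedged sketch of chaining batch-sized jobs with the OpenAI Python SDK, one epoch each; the file IDs are hypothetical, and you'd want real error handling around the polling loop:

```python
import time
from openai import OpenAI

client = OpenAI()

# Hypothetical pre-uploaded training files, one per token-limited batch.
batch_file_ids = ["file-batch1", "file-batch2", "file-batch3"]

base_model = "gpt-3.5-turbo"  # the first job starts from the stock model
for file_id in batch_file_ids:
    job = client.fine_tuning.jobs.create(
        training_file=file_id,
        model=base_model,
        hyperparameters={"n_epochs": 1},  # a single epoch per batch
    )
    # Wait for this job to finish before chaining the next batch onto it.
    while job.status not in ("succeeded", "failed", "cancelled"):
        time.sleep(60)
        job = client.fine_tuning.jobs.retrieve(job.id)
    if job.status != "succeeded":
        raise RuntimeError(f"Job {job.id} ended with status {job.status}")
    base_model = job.fine_tuned_model  # the "checkpoint" becomes the next base
```

Each completed job's model ID becomes the base for the next batch, so the token limit per individual job never really constrains the total amount of data you can train on.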