After my fine-tuning job, the trained-token count for my dataset increased. What is the reason behind this?
If the number of epochs is the reason, we used the default epoch count of 3, so the token count should be a multiple of 3, i.e. 680,107 * 3, but the reported count is 1,274,154, which is only nearly double.
Hi there!
Sorry if I am misunderstanding the problem, but it sounds like you have 680,107 tokens in your training file and set the number of epochs to 3. With that, I arrive at 2,040,321 trained tokens in total, which is higher than 1,274,154.
Do you mind clarifying?
Thank you for your quick response. My question is: before fine-tuning on the OpenAI platform, my dataset came to about 600,000 tokens in the OpenAI tokenizer tool. After the fine-tuning job on the platform, the token count increased to 1,274,154.
If epochs are the reason the token count increases after the fine-tuning job is created, we used the default epoch count of 3, so the token count should be a multiple of 3, i.e. 680,107 * 3, but the count is 1,274,154. What is the main reason for the increase in token count?
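The arithmetic being discussed can be checked with a small sketch. The numbers below come from this thread; the variable names are just illustrative. It shows that 1,274,154 is not a whole-number multiple of 680,107, so the reported count does not correspond to 3 (or any integer number of) epochs over exactly 680,107 tokens.

```python
# Illustrative check of the token arithmetic in this thread.
file_tokens = 680_107   # tokens in the training file (from the question)
reported = 1_274_154    # trained tokens reported after the job
epochs = 3              # default epoch count assumed in the thread

expected = file_tokens * epochs
print(expected)                   # 2040321, what 3 full epochs would give
print(reported % file_tokens)     # nonzero, so not an integer epoch multiple
print(reported / file_tokens)     # roughly 1.87 "epochs' worth" of tokens
```

Since the ratio is not an integer, the discrepancy likely comes from something other than simply multiplying the tokenizer-tool count by the epoch count, for example a different effective epoch count chosen by the platform, or tokens counted differently during training than by the standalone tokenizer tool.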