- How do I increase the epoch data set limit? If I increase the data set above 10, I get an error.
Perhaps there is a misunderstanding.
10 is the minimum number of training examples required in a fine-tuning file. Anything below that is blocked, because training on so little data would simply be pointless and demonstrate a misunderstanding of the technology.
The actual number of examples you will need to create for supervised fine-tuning is much higher than that if you want to instill quality in responses or behavior for a specialized yet broad and informed task.
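For reference, a chat fine-tuning training file is plain JSONL with one complete example per line, in the system/user/assistant messages format. The content below is hypothetical filler, just to show the shape:

```jsonl
{"messages": [{"role": "system", "content": "You are a terse support bot."}, {"role": "user", "content": "How do I reset my password?"}, {"role": "assistant", "content": "Open Settings > Security and choose Reset password."}]}
{"messages": [{"role": "system", "content": "You are a terse support bot."}, {"role": "user", "content": "Where do I find my invoices?"}, {"role": "assistant", "content": "They are under Billing > Invoice history."}]}
```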
If I increase the data set above 10, I get an error when fine-tuning the model. Please explain.
You are passing the first check and then failing a second one. The problem is not likely to be the length of the training file.
`n_epochs` is NOT a control over the size of the file. It is a hyperparameter for how many times the entire file is repeated during training, which increases the expense. Set it to 2 for your first experiment, or leave the parameter out and let it default to "auto".
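If you are starting jobs from code rather than the web UI, a minimal sketch using the openai Python SDK (v1 style) might look like the following. The file name and model choice are assumptions, and the try/except is only there so that you actually see what the API reports back:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

try:
    # Upload the training data ("train.jsonl" is a hypothetical file name)
    training_file = client.files.create(
        file=open("train.jsonl", "rb"),
        purpose="fine-tune",
    )

    # Create the job; n_epochs controls repeated passes over the data, not file size
    job = client.fine_tuning.jobs.create(
        training_file=training_file.id,
        model="gpt-3.5-turbo",            # must be a model your account can fine-tune
        hyperparameters={"n_epochs": 2},  # or omit this line to leave it at "auto"
    )
    print("Job created:", job.id, job.status)
except Exception as err:
    # The exception text usually contains the real reason a request was rejected
    print("Fine-tuning request failed:", err)
```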
Read the error message from the API call or from the platform site's web UI and pay close attention to what is actually being reported.
- only select AI models are available for fine-tuning
- this can be account-specific
- you need an account with enough prepaid funding to pay for the job.
- you need to use code that authenticates properly, with a well-formed file where each line is JSON that validates (see the sketch after this list).
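As a quick sanity check before uploading, a rough sketch in plain Python (hypothetical file name) can confirm that every line in the file parses as JSON and carries a messages list:

```python
import json

# Hypothetical path; point this at your own training file
PATH = "train.jsonl"

with open(PATH, encoding="utf-8") as f:
    for lineno, line in enumerate(f, start=1):
        line = line.strip()
        if not line:
            print(f"line {lineno}: empty line")
            continue
        try:
            record = json.loads(line)
        except json.JSONDecodeError as err:
            print(f"line {lineno}: invalid JSON ({err})")
            continue
        # Chat fine-tuning expects a "messages" list on every line
        if not isinstance(record.get("messages"), list):
            print(f"line {lineno}: missing 'messages' list")
```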
Are you using the platform site user interface? That can help show your access levels: https://platform.openai.com/finetune