I am trying to use the new fine-tuning feature for GPT-3.5 and would love to set my own epochs, batch_size, and learning rate. I see that FineTuningJob.create() accepts **params, but I am unable to set the number of epochs, etc. within the function without an error that reads:
```
invalid batch_size: 128, extra fields not permitted
```
Welcome @mcconnell340
Hyperparameters can be specified using the hyperparameters JSON map:
```python
openai.FineTuningJob.create([...], hyperparameters={"n_epochs": value})
```
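For context, a fuller version of that call might look something like this (a minimal sketch, assuming the pre-1.0 openai Python package; the API key, file ID, and epoch count are placeholders):

```python
import openai

openai.api_key = "sk-..."  # placeholder API key

# Sketch: create a fine-tuning job, passing hyperparameters as a map.
job = openai.FineTuningJob.create(
    training_file="file-abc123",      # placeholder ID of an uploaded training file
    model="gpt-3.5-turbo",
    hyperparameters={"n_epochs": 3},  # example value
)
print(job.id, job.status)
```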
Thank you so much! I was inputting it as hyperparameters = {n_epochs = 1}. Understanding the syntax helps.
The following gives the error "invalid batch_size: 128, extra fields not permitted" -- can I only set n_epochs in hyperparameters at this time, or is my syntax still incorrect?
```python
openai.FineTuningJob.create(
    training_file="file-JNKdiQH3wl60mnzrdbtJeLTQ",
    model="gpt-3.5-turbo",
    hyperparameters={
        "n_epochs": 1,
        "batch_size": 128,
        "learning_rate_multiplier": 0.05,
        "classification_n_classes": 194,
    },
)
```
Currently, n_epochs is the only hyperparameter that can be specified.
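So the call above would need to drop the extra fields; a sketch of the accepted form (keeping your training file ID and model) would be:

```python
# Sketch: same call with only the supported hyperparameter.
# batch_size, learning_rate_multiplier, and classification_n_classes
# are rejected by the endpoint at the moment.
openai.FineTuningJob.create(
    training_file="file-JNKdiQH3wl60mnzrdbtJeLTQ",
    model="gpt-3.5-turbo",
    hyperparameters={"n_epochs": 1},
)
```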