I am trying to use the new fine-tuning feature for GPT-3.5 and would love to set my own epochs, batch size, and learning rate. I see that `FineTuningJob.create()` accepts a `hyperparameters` object, but whenever I try to set the batch size (or anything beyond the epoch count) inside it, I get an error:

```
invalid batch_size: 128, extra fields not permitted
```

Can I only set `n_epochs` in `hyperparameters` at this time, or is my syntax still incorrect?
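For reference, here is roughly what I'm running (pre-1.0 `openai` Python SDK; the training file ID and the hyperparameter values are placeholders, and the commented-out line is the call that fails):

```python
# Sketch of the request that triggers the error. Only the hyperparameters
# dict matters here; everything else is a placeholder.
request = dict(
    training_file="file-abc123",   # placeholder: an already-uploaded JSONL file ID
    model="gpt-3.5-turbo",
    hyperparameters={
        "n_epochs": 3,       # this key appears to be accepted
        "batch_size": 128,   # this key is rejected: "extra fields not permitted"
    },
)

# import openai
# openai.FineTuningJob.create(**request)  # raises the error quoted above
```

Dropping `batch_size` (and `learning_rate_multiplier`) from the dict makes the call succeed, which is what makes me suspect only `n_epochs` is currently allowed rather than a syntax mistake on my end.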