Unable to configure hyperparameters using the FineTuningJob API when fine-tuning GPT-3.5

We are fine-tuning a custom model built on the gpt-3.5-turbo model, using the cookbook notebook for fine-tuning. We are unable to use the following hyperparameters:
batch_size, n_epochs, learning_rate_multiplier

Could you please confirm whether the above parameters are configurable for gpt-3.5 fine-tuning using "openai.FineTuningJob.create()"? We get an error if these params are included.

However, referring to the DataCamp article on the topic "fine-tuning-gpt-3-using-the-open-ai-api-and-python", they were able to configure hyperparameters using openai.FineTune.create(), which I believe is valid only for GPT-3 and not for GPT-3.5.

Could you confirm our observation and let us know how to configure these params for gpt-3.5?
A sample code reference for both the API and curl command usage would be appreciated.

Welcome to the OpenAI dev forum @krkothan

When fine-tuning came out for gpt-3.5-turbo, only n_epochs could be configured.

Here’s the Python code for this:

from openai import OpenAI
client = OpenAI()

client.fine_tuning.jobs.create(
  training_file="file-abc123",
  model="gpt-3.5-turbo",
  hyperparameters={
    "n_epochs":2
  }
)

curl:

curl https://api.openai.com/v1/fine_tuning/jobs \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -d '{
    "training_file": "file-abc123",
    "model": "gpt-3.5-turbo",
    "hyperparameters": {
      "n_epochs": 2
    }
  }'
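If you want to fail fast before making the API call, a small client-side check can reject hyperparameters the endpoint does not accept. This is a hypothetical helper, not part of the OpenAI SDK, and the supported set below simply mirrors the answer above (only n_epochs at launch):

```python
# Hypothetical helper: validate a hyperparameters dict before submitting
# a fine-tuning job. The supported set reflects this thread's answer
# (only "n_epochs" when gpt-3.5-turbo fine-tuning launched) and is an
# assumption, not an official list from OpenAI.
SUPPORTED_HYPERPARAMETERS = {"n_epochs"}

def validate_hyperparameters(hyperparameters):
    """Return the dict unchanged, or raise ValueError for unsupported keys."""
    unsupported = set(hyperparameters) - SUPPORTED_HYPERPARAMETERS
    if unsupported:
        raise ValueError(
            f"Unsupported hyperparameters: {sorted(unsupported)}; "
            f"supported: {sorted(SUPPORTED_HYPERPARAMETERS)}"
        )
    return hyperparameters

# OK:
validate_hyperparameters({"n_epochs": 2})

# Raises ValueError, since batch_size is not in the supported set:
# validate_hyperparameters({"n_epochs": 2, "batch_size": 4})
```

You would call this on the dict just before passing it to client.fine_tuning.jobs.create(...), so a typo or unsupported key surfaces locally instead of as an API error.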

Hello sps,

Thanks a lot for the quick update, the code, and the confirmation that only the n_epochs hyperparameter is supported for gpt-3.5.
It would be appreciated if you could share any OpenAI doc reference in this context.

In that case, what other fine-tuning params can be tried?
Any relevant inputs will help us move forward.

Regards
Krithiga