masa_u
August 28, 2023, 5:21am
1
I’m trying to replace deprecated fine-tuned models with new ones based on the babbage-002 model, using the new fine-tuning OpenAI API.
Options and hyperparameters that the old command accepted when running fine-tuning, such as -v, --n_epochs, and --compute_classification_metrics, are not found on the new fine-tuning page.
I want to use the parameters below
openai api fine_tunes.create -t ./data/train_v4_prepared_train.jsonl \
-v ./data/train_v4_prepared_valid.jsonl \
-m ada \
--suffix v4 \
--n_epochs 20 \
--compute_classification_metrics \
--classification_n_classes 2 \
--classification_positive_class " YES"
in the new version of the command:
openai.FineTuningJob.create(
training_file="./data/train_v4_prepared_train.jsonl",
model="babbage-002"
)
And how can I set validation data in the new fine-tuning command?
Request body

training_file (string, Required): The ID of an uploaded file that contains training data. See upload file for how to upload a file. Your dataset must be formatted as a JSONL file. Additionally, you must upload your file with the purpose fine-tune. See the fine-tuning guide for more details.

validation_file (string or null, Optional): The ID of an uploaded file that contains validation data. If you provide this file, the data is used to generate validation metrics periodically during fine-tuning. These metrics can be viewed in the fine-tuning results file. The same data should not be present in both train and validation files. Your dataset must be formatted as a JSONL file. You must upload your file with the purpose fine-tune. See the fine-tuning guide for more details.

model (string, Required): The name of the model to fine-tune. You can select one of the supported models.

hyperparameters (object, Optional): The hyperparameters used for the fine-tuning job.
  n_epochs (string or integer, Optional, defaults to auto): The number of epochs to train the model for. An epoch refers to one full cycle through the training dataset.

suffix (string or null, Optional, defaults to null): A string of up to 40 characters that will be added to your fine-tuned model name. For example, a suffix of “custom-model-name” would produce a model name like ft:gpt-3.5-turbo:openai:custom-model-name:7p4lURel.
https://platform.openai.com/docs/api-reference/fine-tuning/create#fine-tuning/create-suffix
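Putting that reference together, a rough sketch of what the new-style call might look like (assuming the pre-1.0 openai Python SDK; the file paths, suffix, and epoch count are just carried over from the old command as placeholders, and the create call takes uploaded-file IDs rather than local paths):

import openai

# Rough sketch, not confirmed: upload both JSONL files first with purpose "fine-tune",
# because FineTuningJob.create expects uploaded-file IDs, not local paths.
train_file = openai.File.create(
    file=open("./data/train_v4_prepared_train.jsonl", "rb"),
    purpose="fine-tune",
)
valid_file = openai.File.create(
    file=open("./data/train_v4_prepared_valid.jsonl", "rb"),
    purpose="fine-tune",
)

# validation_file and suffix are optional; n_epochs goes inside the hyperparameters object.
# (You may need to wait for the uploads to finish processing before creating the job.)
job = openai.FineTuningJob.create(
    training_file=train_file.id,
    validation_file=valid_file.id,
    model="babbage-002",
    suffix="v4",
    hyperparameters={"n_epochs": 20},
)
print(job.id, job.status)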
Here you go… The parameters are still there… you just need to pass them…

import openai

# Legacy (pre-babbage-002) fine-tune endpoint: hyperparameters are passed as top-level arguments.
# training_file and validation_file must be the IDs of files already uploaded with purpose "fine-tune".
openai.FineTune.create(
    model=model,
    n_epochs=n_epochs,
    batch_size=batch_size,
    learning_rate_multiplier=learning_rate_multiplier,
    training_file=training_file_id,
    validation_file=validation_file_id,
)
I believe this only works for the legacy/deprecated fine-tuned models.
To fine-tune the new babbage-002, we have to use the new OpenAI SDK, as below:
openai.FineTuningJob.create(training_file="file-abc123", model="babbage-002")
I’ve tried passing n_epochs through this new OpenAI SDK, but I always get the error: InvalidRequestError: invalid n_epochs: 4, extra fields not permitted
If anyone knows how to add a custom n_epochs and other hyperparameters such as batch_size and learning_rate, please let me know. It would be much appreciated.
Hi Masa,
Have you found a way to add the n_epochs when creating a fine-tuning job?
_j
September 5, 2023, 4:19am
5
That microscope icon at the top right? It will let you put in search terms, such as “gpt-3.5-turbo epochs hyperparameters”.
Welcome @mcconnell340
Hyperparams can be specified using the hyperparameters JSON map:
openai.FineTuningJob.create([...], hyperparameters={"n_epochs": value})
n_epochs is the only one supported by the new endpoint.
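For what it’s worth, a minimal sketch of that call with a concrete value, plus retrieving the job afterwards to confirm the setting was accepted (the file ID and epoch count are placeholders; this assumes the pre-1.0 openai Python SDK):

import openai

# Minimal sketch: "file-abc123" and the epoch count are placeholder values.
job = openai.FineTuningJob.create(
    training_file="file-abc123",
    model="babbage-002",
    hyperparameters={"n_epochs": 4},  # nested map, not a top-level argument
)

# Retrieve the job to check that the hyperparameters were applied.
print(openai.FineTuningJob.retrieve(job.id).hyperparameters)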
Thanks. I was able to add in the n_epochs.