Hyperparameter documentation - where is it?

Hi, can someone let me know which hyperparameters we can control and how to set them? I've only tried 'n_epochs', 'batch_size', 'learning_rate_multiplier', and 'classification_n_classes', but they aren't producing the results I want. Are there hyperparameters to control a quality threshold, like R², or a required drop in MSE? Currently the fine-tuning job ends prematurely even though I specified that I want 5 epochs.

The legacy fine-tunes endpoint and all models previously trained will be shut off in one month.

Here's a link to the documentation, where you can read about a few more legacy ones like compute_classification_metrics.

The replacement fine_tuning endpoint only exposes the ones you mention:

    "n_epochs": 1,
    "batch_size": 1,
    "learning_rate_multiplier": 0.9
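
For reference, here's a minimal sketch of how those three hyperparameters would be passed to the replacement fine_tuning endpoint, assuming the current openai Python SDK; the training-file ID and model name are placeholders, not values from this thread:

```python
# The only hyperparameters the fine_tuning endpoint accepts: an
# explicit n_epochs here should stop the job from auto-deciding
# how many epochs to run.
hyperparameters = {
    "n_epochs": 5,                    # passes over the training set
    "batch_size": 1,
    "learning_rate_multiplier": 0.9,
}

# Uncomment to submit a real job (requires OPENAI_API_KEY to be set):
# from openai import OpenAI
# client = OpenAI()
# job = client.fine_tuning.jobs.create(
#     training_file="file-abc123",    # placeholder file ID
#     model="gpt-3.5-turbo",          # placeholder model name
#     hyperparameters=hyperparameters,
# )

print(sorted(hyperparameters))
```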

So if you are able to use that "classification" parameter, you might be experimenting with models that will only work for one month (unless you're using davinci-002 through that endpoint, and assuming those stay).

Noted on the limited hyperparameters I can play with. Just wondering if there is any way to extend the learning cycles so they don't stop prematurely. Currently, it always seems to stop after a few steps, without even completing 1 epoch.