Technique of Hyperparameter Tuning

Hey,

Is it possible to apply hyperparameter tuning techniques such as grid search, random search, Bayesian optimization, and Tree-structured Parzen Estimators (TPE) when fine-tuning a DaVinci model on a custom dataset?

Thanks

Hey to you!

The only place OpenAI uses the term "hyperparameter" is in its fine-tuning API.

The currently exposed hyperparameters that control learning are the following (a sketch of passing them via the API comes after the list):

batch_size - Number of examples in each batch. A larger batch size means that model parameters are updated less frequently, but with lower variance.

learning_rate_multiplier - Scaling factor for the learning rate. A smaller learning rate may be useful to avoid overfitting.

n_epochs - The number of epochs to train the model for. An epoch refers to one full cycle through the training dataset.
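To make that concrete, here is a minimal sketch of how those three values might be passed when creating a fine-tuning job with the v1 openai Python client. The file name and hyperparameter values are placeholders, not recommendations.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Upload a prepared JSONL training file (format illustrated further down).
training_file = client.files.create(
    file=open("train.jsonl", "rb"),
    purpose="fine-tune",
)

# Create a fine-tuning job on the base completion model, overriding the
# three exposed hyperparameters; values omitted here are chosen automatically.
job = client.fine_tuning.jobs.create(
    model="davinci-002",
    training_file=training_file.id,
    hyperparameters={
        "n_epochs": 3,
        "batch_size": 8,
        "learning_rate_multiplier": 0.1,
    },
)
print(job.id, job.status)
```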

The only base models available for fine-tuning are davinci-002 and the smaller babbage-002. The custom dataset consists of many training examples, each pairing an input prompt (the context) with the completion the AI should produce.
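For these base completion models, each training example is one JSON line with a prompt and its target completion. A tiny illustrative sketch of writing such a file (the example text, the "###" prompt separator, and the " END" stop token are just conventions, not requirements):

```python
import json

# Hypothetical examples: each record maps an input prompt to the
# completion the fine-tuned model should learn to produce.
examples = [
    {"prompt": "Summarize: The cat sat on the mat.\n\n###\n\n",
     "completion": " A cat rested on a mat. END"},
    {"prompt": "Summarize: It rained all day in Paris.\n\n###\n\n",
     "completion": " Paris had rain all day. END"},
]

with open("train.jsonl", "w") as f:
    for ex in examples:
        f.write(json.dumps(ex) + "\n")
```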

OpenAI doesn't disclose the internals of its fine-tuning, such as adaptive optimizer settings, the optimizer itself, any reward model, or reinforcement learning algorithms.
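So the closest you can get to grid search, random search, or Bayesian optimization is a black-box outer loop over the three exposed hyperparameters: launch one fine-tuning job per combination and compare the resulting models yourself. A hedged sketch of such a grid follows; every job is billed, so any real grid should be kept very small, and the file IDs are placeholders.

```python
from itertools import product
from openai import OpenAI

client = OpenAI()

# Illustrative grid over the only knobs OpenAI exposes; values are placeholders.
grid = {
    "n_epochs": [2, 4],
    "batch_size": [4, 16],
    "learning_rate_multiplier": [0.05, 0.2],
}

job_ids = []
for n_epochs, batch_size, lr_mult in product(*grid.values()):
    job = client.fine_tuning.jobs.create(
        model="davinci-002",
        training_file="file-abc123",    # placeholder: uploaded training file ID
        validation_file="file-def456",  # placeholder: held-out data for comparison
        hyperparameters={
            "n_epochs": n_epochs,
            "batch_size": batch_size,
            "learning_rate_multiplier": lr_mult,
        },
    )
    job_ids.append(job.id)

# Afterwards, compare each job's validation metrics (from its result files)
# and keep the best fine-tuned model.
```

Anything more adaptive, such as TPE via an external optimizer, would work the same way: it only ever sees these three knobs and the final validation loss, because the training loop itself runs entirely on OpenAI's side.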