Does iterating on hyperparameters require a new dataset?

I found this in the documentation:

My questions are:
1- Does iterating on hyperparameters mean training the already-trained model (the one produced by the first fine-tuning)?
2- When iterating on epochs, should I upload new dataset examples, or do I have to use the same dataset?

This is fancy language for "if our settings don't work, start again and pay for another full, pricey experiment".

With fine-tune continuation now available, you can use a conservative number of epochs (like 3, if you have a quality set of hundreds of examples or more), and then run additional single-epoch continuations to produce new models if you see that training and validation loss are still improving instead of going backwards.
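A minimal sketch of that flow using the OpenAI Python library (v1+); the file IDs and model names below are placeholders, not values from this thread:

```python
from openai import OpenAI

client = OpenAI()

# First fine-tune: a conservative number of epochs on the base model.
first_job = client.fine_tuning.jobs.create(
    training_file="file-TRAIN_ID",    # placeholder: your uploaded training file
    validation_file="file-VALID_ID",  # placeholder: your uploaded validation file
    model="gpt-3.5-turbo",
    hyperparameters={"n_epochs": 3},
)

# Later, if validation loss is still improving, continue for one more epoch
# by passing the fine-tuned model's name as the model to train on.
continuation_job = client.fine_tuning.jobs.create(
    training_file="file-TRAIN_ID",
    validation_file="file-VALID_ID",
    model="ft:gpt-3.5-turbo:my-org:custom:abc123",  # placeholder fine-tuned model name
    hyperparameters={"n_epochs": 1},
)
```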

If you have more quality examples, I wouldn't save them for later; I would shuffle them into the mix, and then randomly pull out 10% of the examples as a validation set (or more, if 10% results in under 20 or so).
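A quick sketch of that shuffle-and-split step; the file names are placeholders:

```python
import json
import random

# Load every example from one combined JSONL file.
with open("all_examples.jsonl", "r", encoding="utf-8") as f:
    examples = [json.loads(line) for line in f if line.strip()]

random.shuffle(examples)

# Hold out roughly 10% for validation, bumping it up toward 20 examples
# when 10% would be too small (and never taking more than half).
n_valid = max(int(len(examples) * 0.10), min(20, len(examples) // 2))
valid, train = examples[:n_valid], examples[n_valid:]

for path, rows in (("train.jsonl", train), ("valid.jsonl", valid)):
    with open(path, "w", encoding="utf-8") as f:
        for row in rows:
            f.write(json.dumps(row, ensure_ascii=False) + "\n")

print(f"{len(train)} training examples, {len(valid)} validation examples")
```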

Sorry that my English is weak @_j
You mean that I have two options:

  • Start a new fine-tuning with a different number of epochs.
  • Fine-tune the previously fine-tuned model with a different number of epochs (this will result in a new model).
    And for both options there is no need for a new dataset; I can use the same first dataset.
    Do I understand you right?

You are correct in your assessment.

The only thing not determined is whether 3 + 3 = 6 as far as running a second fine-tune on the same model, or whether the effect has a different magnitude, and in which direction.

An epoch is simply another pass of the ML training algorithm over your data. Changing the epoch setting may also alter other hyperparameters such as the learning rate, though, now that OpenAI has tried to make more things auto-configured.
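If you want the other knobs to stay fixed while you change the epoch count, you can set them explicitly in the job request; a sketch, where the IDs and values are placeholders rather than recommendations:

```python
from openai import OpenAI

client = OpenAI()

# Pin batch size and learning rate multiplier explicitly so that changing
# n_epochs does not let the service auto-pick different values for them.
job = client.fine_tuning.jobs.create(
    training_file="file-TRAIN_ID",                  # placeholder
    model="ft:gpt-3.5-turbo:my-org:custom:abc123",  # placeholder fine-tuned model
    hyperparameters={
        "n_epochs": 1,
        "batch_size": 8,
        "learning_rate_multiplier": 0.5,
    },
)
print(job.id, job.status)
```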