This is great advice. I was playing around with fine-tuning for this as well. If you limit max tokens to 1, though, isn't that still a multi-class rather than a multi-label model? Even with fine-tuning I've been struggling to get the model to spit out the correct set of labels on the TRAINING data… sigh.
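For what it's worth, one way around the max-tokens constraint is to let the completion contain the whole label set instead of a single label token. A minimal sketch (the label names, prompt suffix, and JSONL shape here are all hypothetical, not from any specific fine-tuning guide):

```python
import json

# Hypothetical multi-label training examples: a text can carry MULTIPLE labels,
# so the completion must hold the full label set. Capping max_tokens at 1 only
# leaves room for a single label token, which collapses the task into
# multi-class (single-label) prediction.
examples = [
    {"text": "Refund my order and cancel my subscription",
     "labels": ["refund", "cancellation"]},
    {"text": "The app crashes on startup",
     "labels": ["bug"]},
]

def to_jsonl_line(example):
    # Serialize the label set in a fixed (sorted) order so the target
    # completion is deterministic for each input.
    return json.dumps({
        "prompt": example["text"] + "\n\nLabels:",
        "completion": " " + ", ".join(sorted(example["labels"])),
    })

lines = [to_jsonl_line(e) for e in examples]
print(lines[0])
```

With a format like this, max_tokens at inference time needs to be large enough for the longest plausible label set, plus a stop sequence to end the list.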