Does anyone have experience fine-tuning GPT-4o mini on languages other than English? I am trying to fine-tune it on a Korean dataset, but the resulting model behaves very strangely. For example, it returns mixed-language characters I have never seen before, or it keeps repeating the same sentence until the maximum token limit is reached. Has anyone else run into this? What could be the cause?
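For what it's worth, garbled mixed-script output like this often traces back to encoding damage (mojibake) in the training JSONL itself, and looping until the token limit can point to truncated or duplicated assistant examples. A minimal sanity check over the training file might help rule that out. This is only a sketch: the file name, the list of flagged scripts, and the `suspicious_chars` helper are assumptions for illustration, not part of the fine-tuning API.

```python
# Hypothetical sanity check for a chat-format fine-tuning JSONL file.
# Scans each example for U+FFFD replacement characters (classic mojibake)
# and for scripts that should not appear in a Korean/English dataset.
import json
import unicodedata

TRAIN_FILE = "korean_train.jsonl"  # assumed path to your training file


def suspicious_chars(text: str) -> list[str]:
    """Return characters that look like encoding damage."""
    bad = []
    for ch in text:
        if ch == "\ufffd":  # explicit Unicode replacement character
            bad.append(ch)
            continue
        name = unicodedata.name(ch, "")
        # Flag scripts unexpected in Korean/English training data;
        # adjust this list to match the languages your dataset uses.
        if any(tag in name for tag in ("CYRILLIC", "THAI", "ARABIC")):
            bad.append(ch)
    return bad


with open(TRAIN_FILE, encoding="utf-8", errors="replace") as f:
    for lineno, line in enumerate(f, start=1):
        example = json.loads(line)
        for msg in example.get("messages", []):
            bad = suspicious_chars(msg.get("content", ""))
            if bad:
                print(f"line {lineno}: suspicious characters {bad!r}")
```

If the data checks out clean, the repetition loop may instead come from decoding settings at inference time (a `frequency_penalty` can mask it), but in my understanding the root cause is usually in the training data rather than the sampler.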