Are fine-tuned models a good way to give GPT a specific tone of voice?

Fine-tuning can indeed be a good way to give GPT a specific tone of voice. By fine-tuning the model on examples of your own writing, you can train it to generate text that aligns more closely with your desired style and tone.
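For reference, the legacy fine-tuning jobs expect JSONL training data of prompt/completion pairs, so "examples of your own writing" would end up looking roughly like this (the prompts and completions here are just placeholders):

```jsonl
{"prompt": "Write a short product update about the new dashboard ->", "completion": " Big news, folks: the dashboard got a glow-up, and it only took us three espresso-fueled sprints..."}
{"prompt": "Write a reply politely declining a meeting ->", "completion": " Appreciate the invite, but I'll have to sit this one out -- deadlines are circling like seagulls..."}
```

A few hundred pairs in that shape is typically where fine-tuning starts to pick up a consistent voice.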

However, currently only the text-completion models are fine-tunable (it’s been suggested the newer chat-completion models will be fine-tunable soon), so you may want to hold off on fine-tuning those older models and instead work on refining a “like a drunken sailor” voice prompt. A voice guide of a few paragraphs can go a long way toward generating text that sounds more like you.
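As a minimal sketch of the prompt-based approach, assuming the pre-1.0 `openai` Python package and a placeholder voice guide (the guide text and user message are made up for illustration):

```python
import openai  # pip install openai (pre-1.0 API shown here)

# A few paragraphs describing the voice you want; placeholder content.
VOICE_GUIDE = """You write like a drunken sailor on shore leave:
salty, rambling, oddly affectionate, and fond of nautical metaphors.
Keep sentences short. Never use corporate jargon."""

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        # The voice guide rides along as the system message on every call.
        {"role": "system", "content": VOICE_GUIDE},
        {"role": "user", "content": "Draft a welcome email for new subscribers."},
    ],
)

print(response["choices"][0]["message"]["content"])
```

The nice part is iteration speed: tweaking a paragraph in the system message takes seconds, whereas re-running a fine-tune means rebuilding the dataset and waiting on a training job.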
