My model is mostly fine-tuned and works fairly well. The one thing missing is that I would like the model to ask a question back to the user so that the conversation keeps flowing.
Human: Thanks for meeting with me today.
Model: Absolutely, glad to be here. How are you doing?
Human: I’m doing well. So what are you working on?
Model: I’m working on a project for Kodak. How about you?
Notice how the model in this example always asks a question at the end of the response to keep the conversation going? Any way to fine-tune it that way?
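For reference, here is a minimal sketch of what the training data could look like in the legacy prompt/completion JSONL format, with every completion deliberately ending in a question. The `"###"` separator and `" END"` stop token are just the conventions the legacy fine-tuning guide recommends, not anything from an actual file:

```python
import json

# Hypothetical training pairs; every completion ends with a question
# so the model (hopefully) learns to keep the conversation going.
pairs = [
    ("Thanks for meeting with me today.",
     "Absolutely, glad to be here. How are you doing?"),
    ("I'm doing well. So what are you working on?",
     "I'm working on a project for Kodak. How about you?"),
]

with open("finetune_data.jsonl", "w") as f:
    for prompt, completion in pairs:
        # Legacy format: one {"prompt", "completion"} JSON object per line,
        # with a fixed separator after the prompt and a stop token at the end.
        record = {
            "prompt": prompt + "\n\n###\n\n",
            "completion": " " + completion + " END",
        }
        f.write(json.dumps(record) + "\n")
```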
Hi @ruby_coder. We did do that. Almost all of the completions in the fine-tune JSON have a question at the end, but the model still returns a non-question answer about half of the time. It seems to pick and choose parts of multiple completions in its response, and sometimes the parts it picks do not have a question.
Prompt: Thanks for meeting with me today.
Completion: Absolutely, glad to be here. How are you doing?
Prompt: I’m doing well. So what are you working on?
Completion: I’m working on a project for Kodak. How about you?
From these two completions, the model will then produce something like: “Absolutely. Glad to be here. I’m working on a project.” It combines pieces of multiple completions but leaves out the part with the question.
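One workaround, since fine-tuning alone isn’t reliable here, is a post-processing guard: check whether the model’s reply ends with a question and, if not, either re-sample or append a follow-up. This is just a sketch, not anything from the thread, and the fallback line is a made-up placeholder:

```python
def ensure_question(reply: str, fallback: str = "What about you?") -> str:
    """Return the reply unchanged if it already ends with a question;
    otherwise tack on a canned follow-up question.

    A fancier version could re-sample from the model instead of
    appending a fixed fallback.
    """
    text = reply.strip()
    if text.endswith("?"):
        return text
    return f"{text} {fallback}"
```

For example, `ensure_question("Absolutely. Glad to be here. I’m working on a project.")` would append the fallback question, while a reply that already ends in “?” passes through untouched.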