Preparing data to fine-tune a function-calling model

The function-calling method can be used when training fine-tuned models. The training data must use a special assistant-message format, because OpenAI doesn't disclose the actual language the AI generates for the API's tool-recipient code (and has disabled spaces in role names, so you can't quite reproduce it yourself).
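As a rough sketch (the function name, arguments, and file name here are hypothetical, not from OpenAI's docs), one line of a fine-tuning JSONL file in that assistant format could be built like this:

```python
import json

# One training example: the assistant "replies" with a function_call
# instead of normal content. Names and values below are made up.
example = {
    "messages": [
        {"role": "system", "content": "You are a weather assistant."},
        {"role": "user", "content": "What's the weather in Paris right now?"},
        {
            "role": "assistant",
            "content": None,
            "function_call": {
                "name": "get_current_weather",
                # "arguments" is a JSON *string*, not a nested object
                "arguments": json.dumps({"location": "Paris, FR", "unit": "celsius"}),
            },
        },
    ],
    "functions": [
        {
            "name": "get_current_weather",
            "description": "Get the current weather for a location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {"type": "string"},
                    "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                },
                "required": ["location"],
            },
        }
    ],
}

# Each training example is one line of the JSONL upload file.
with open("train.jsonl", "a") as f:
    f.write(json.dumps(example) + "\n")
```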

The format you train the AI on is the same even if you plan to use "tools". The only thing not exposed is having the AI make multiple tool calls at once (parallel calls), which relies on additional undisclosed prompting to make the AI write more. You should be able to use the fine-tuned model with either the "functions" or the "tools" method.

gpt-3.5-turbo shouldn't actually need fine-tune training on emitting functions, unless you have a very specific stylistic or behavioral choice that must be baked into them. OpenAI's guide says you then don't need to include full function definitions, but the API still requires the functions or tools definitions to be passed, or you get nothing.
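For example, a minimal inference call against the fine-tuned model still sends the definition (a sketch assuming the openai Python package v1.x; the fine-tune model ID and the function definition are placeholders matching the hypothetical training example above):

```python
from openai import OpenAI

client = OpenAI()

# The same definition used in training still has to be sent at
# inference time, even though the model was fine-tuned on it.
functions = [
    {
        "name": "get_current_weather",
        "description": "Get the current weather for a location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string"},
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["location"],
        },
    }
]

response = client.chat.completions.create(
    model="ft:gpt-3.5-turbo-0613:my-org::abc123",  # placeholder fine-tune ID
    messages=[{"role": "user", "content": "What's the weather in Paris right now?"}],
    functions=functions,
    # or equivalently: tools=[{"type": "function", "function": f} for f in functions]
)
print(response.choices[0].message.function_call)
```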

Note: fine-tuning with functions currently has a fault in that it trains the AI incorrectly on terminating normal output, resulting in repeated lines. A further workaround is still needed.
