I want my chatbot to call functions exclusively. It seems like it would make sense to fine-tune towards this, but according to the docs I can only supply Q&A pairs. Can I do it by creating “questions” and “answers” that cohere to the function-calling protocol? Or is it not possible?
I am building something similar, and what I am trying first is to give a few examples at the beginning of the conversation so GPT can follow them. Under the hood, even though it presents a chat interface, it is no different from one big text prompt, like other models. Try that and see if it works for your use case.
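To make that concrete, a few-shot setup might prepend worked example turns to the `messages` list before the real user input. This is only a sketch: the function name `get_weather` and all of the message contents are hypothetical, not part of any real schema.

```python
# Sketch of a few-shot message list for the Chat Completions API.
# The function name "get_weather" and the example contents are
# hypothetical, chosen purely for illustration.

def build_few_shot_messages(user_input):
    """Prepend worked examples so the model imitates the function-call style."""
    examples = [
        {"role": "system", "content": "Always respond by calling a function."},
        # Example turn: user asks, assistant replies with a function call
        {"role": "user", "content": "What's the weather in Paris?"},
        {
            "role": "assistant",
            "content": None,
            "function_call": {
                "name": "get_weather",
                "arguments": '{"city": "Paris"}',
            },
        },
    ]
    return examples + [{"role": "user", "content": user_input}]

# The resulting list is what you would pass as `messages` in the API call.
messages = build_few_shot_messages("What's the weather in Tokyo?")
```

The model tends to continue the pattern it sees, so the more the examples resemble your real traffic, the better this works.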
Hi and welcome to the developer forum!
You can make the GPT models call functions exclusively by setting the `function_call` parameter in the model API call:

```python
openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=messages,
    functions=functions,
    # "auto" is the default, so we force our function name instead
    function_call={"name": "your_function_name"},
)
```
@Foxabilo thank you, I have that working already. I am just trying to figure out if it’s possible to refine this behavior using fine-tuning. I.e. “With a prompt like this, you should call this function with parameters like this…”, etc.
On the face of it, sure, that seems reasonable. I’ve not tried it, so you’d need to experiment with it, but if the function call is influenced by the text in the prompt, then fine-tuning on those prompts should at least have some effect.
According to the recent fine-tuning updates and the documentation, fine-tuning with function calling is currently unavailable. Here (OpenAI Platform), it states that:
" We do not currently support function calling examples but are working to enable this."
So we might see it in the future, but right now we have to manage it through plain chat-completion examples only.
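In the meantime, one workaround along the lines of the original question is to encode the desired function call as plain assistant text in the chat fine-tuning JSONL format, then parse it back out at inference time. A minimal sketch follows; the function name `lookup_order` and the JSON-in-text convention are assumptions, not an official format:

```python
import json

# One training example in the chat fine-tuning JSONL format, where the
# "answer" is the function call serialized as plain text. The function
# name "lookup_order" and the JSON-in-text convention are assumptions.
example = {
    "messages": [
        {"role": "system", "content": "Reply only with a JSON function call."},
        {"role": "user", "content": "Where is order 123?"},
        {
            "role": "assistant",
            "content": json.dumps(
                {"name": "lookup_order", "arguments": {"order_id": "123"}}
            ),
        },
    ]
}

# Each line of the training file is one such example:
jsonl_line = json.dumps(example)

# At inference time, parse the assistant's text back into a call:
call = json.loads(example["messages"][-1]["content"])
```

The trade-off is that you lose the API's built-in argument validation, so the parsing step should handle malformed JSON from the model.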
This is also a feature I have been looking to implement. After fine-tuning my own 3.5-turbo model, it appears I am unable to use functions with the tuned model. Looks like they plan on adding it in the future. Hopefully soon.
It doesn’t appear that a fine-tuned 3.5-turbo model accepts a `functions` argument. When I put my custom-tuned 3.5-turbo model in the `model` parameter, the API reported an unrecognized parameter `functions` when I ran my code.
Yeah, the lack of this feature is a real dealbreaker for some use cases.
Is there a way to be notified when it’s finally released?