Fine-tuning for function_call?

I want my chat bot to call functions exclusively. It seems like it would make sense to fine-tune towards this, but according to the docs I can only supply Q/A pairs. Can I do it by creating “questions” and “answers” that conform to the function-calling format, or is it not possible?

I am building something similar, and what I am trying first is to give a few examples at the beginning of the conversation so GPT can follow the pattern. Under the hood, even though it is a chat interface, it is no different from a big text box like other models. Try that and see if it works for your use case.
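
If it helps, here is a rough, untested sketch of that approach with the pre-1.0 openai Python library (the same style used elsewhere in this thread). The get_weather function, its schema, and the example texts are all made up for illustration:

import openai

# Hypothetical function schema, purely for illustration.
functions = [{
    "name": "get_weather",
    "description": "Get the current weather for a city",
    "parameters": {
        "type": "object",
        "properties": {"location": {"type": "string"}},
        "required": ["location"],
    },
}]

# Example turns prepended to the conversation so the model sees how a
# request should map to a function call.
few_shot = [
    {"role": "system", "content": "Always respond by calling a function."},
    {"role": "user", "content": "What's the weather in Paris?"},
    {"role": "assistant", "content": None,
     "function_call": {"name": "get_weather",
                       "arguments": "{\"location\": \"Paris\"}"}},
]

messages = few_shot + [{"role": "user", "content": "What's the weather in Berlin?"}]

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=messages,
    functions=functions,
)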

Hi and welcome to the developer forum!

You can make the GPT models call a specific function exclusively by setting function_call={"name": "your_function_name"} in the model API call ("auto" and "none" are the other accepted values):

import openai

openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=messages,
    functions=functions,
    function_call={"name": "your_function_name"},  # "auto" is the default; a dict with the name forces this function
)
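
And to read the result, something along these lines (assuming you keep the return value of the call above in response; attribute access is how the pre-1.0 library exposes these fields):

import json

# response = openai.ChatCompletion.create(...)  as above
call = response.choices[0].message.function_call  # always present when the call is forced
arguments = json.loads(call.arguments)            # the model returns arguments as a JSON string
print(call.name, arguments)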

@Foxalabs thank you, I have that working already. I am just trying to figure out if it’s possible to refine this behavior using fine-tuning. I.e. “With a prompt like this, you should call this function with parameters like this…”, etc.

On the face of it, sure, that seems reasonable. I’ve not tried it, so you’d need to experiment, but if the function call is influenced by the text in the prompt, fine-tuning on that text should at least have some effect.

According to recent updates to fine-tuning and its documentation, fine-tuning with function calling examples is currently unavailable. Here (OpenAI Platform), it states that:

"We do not currently support function calling examples but are working to enable this."

So we might see it in the future, but right now we have to manage this through plain chat completions only.

This is also a feature I have been looking to implement. After fine-tuning my own 3.5-turbo model, it appears I am unable to use functions with the tuned model. Looks like they plan on adding it in the future. Hopefully soon.

It doesn’t appear that a fine-tuned 3.5-turbo model accepts a functions argument. When I put my custom-tuned 3.5-turbo model in the model field and ran my code, the API complained about an unrecognized functions parameter.

Yeah, the lack of this feature is a real dealbreaker for some use-cases :confused:

Is there a way to be notified when it’s finally released?

Fine-tuning function calls appears to be possible now: OpenAI Platform
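
For anyone picking this up now: the training examples look roughly like the usual chat format, with the available function definitions and the assistant’s function_call included. A minimal, untested sketch of writing one such example to a JSONL file (the get_weather schema and the contents are made up; check the current docs for the exact fields):

import json

# Hypothetical function schema, for illustration only.
weather_function = {
    "name": "get_weather",
    "description": "Get the current weather for a city",
    "parameters": {
        "type": "object",
        "properties": {"location": {"type": "string"}},
        "required": ["location"],
    },
}

# One training example: a user prompt plus the function call the model
# should learn to produce for it.
example = {
    "messages": [
        {"role": "user", "content": "What's the weather in Paris?"},
        {"role": "assistant", "content": None,
         "function_call": {"name": "get_weather",
                           "arguments": "{\"location\": \"Paris\"}"}},
    ],
    "functions": [weather_function],
}

with open("train.jsonl", "w") as f:
    f.write(json.dumps(example) + "\n")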

Hello guys, I’m stuck on this error: RateLimitError: 429 You exceeded your current quota, please check your plan and billing details. Does anyone know about it? Please reply. This is my first time using the API and I haven’t paid anything for it; I’m on the free trial, but it tells me my plan and usage quota are exceeded. Please feel free to reply. I am like your small brother :). I am creating a voice assistant web application like ChatGPT.

Hey Champ, and welcome to the developer community forum.

You’ll need to set up paid billing for the API to continue using it once your free trial credit has expired or run out. More about rate limits: OpenAI Platform
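
If you just want your voice assistant to fail gracefully while you sort out billing, a rough sketch with the pre-1.0 openai library is to catch the exception and surface a message (retrying will not help when the quota itself is exhausted):

import openai

try:
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": "Hello"}],
    )
except openai.error.RateLimitError as e:
    # 429 covers both rate limits and exhausted quota.
    print(f"Quota or rate limit problem: {e}")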
