"Unrecognized request argument supplied: functions" when used on fine-tuned 3.5 model

Hi

I am implementing structured response parsing as per the guide on Langchain JS (I am not allowed to link to it), and it works nicely when I specify the model as “gpt-3.5-turbo-0613”. However, the standard model struggles with our use case, which is fairly domain-specific, so we have created a fine-tuned model, which produces better results in the Playground.

When I came to specify the newly fine-tuned model, I got the following error:

error: Unrecognized request argument supplied: functions

From some reading, it seems some of the older models do not support functions (this might only be true for Azure), but 0613 does. I use “gpt-3.5-turbo-0613” as the base model for the fine-tune. Is there something else I need to do to enable functions when using a fine-tuned model, or is it the case that they don’t yet support them?
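
To make the failure concrete, here is a stripped-down sketch of the kind of request that triggers it, using the OpenAI client directly rather than Langchain. The model id and function definition below are placeholders, not my actual code:

```ts
import OpenAI from "openai";

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

async function reproduce() {
  // Hypothetical function definition; in my case Langchain builds this from a Zod schema.
  const completion = await openai.chat.completions.create({
    model: "ft:gpt-3.5-turbo-0613:my-org::example", // placeholder fine-tuned model id
    messages: [{ role: "user", content: "Classify this ticket: ..." }],
    functions: [
      {
        name: "record_classification",
        description: "Record the classification of a support ticket",
        parameters: {
          type: "object",
          properties: { category: { type: "string" } },
          required: ["category"],
        },
      },
    ],
  });
  console.log(completion.choices[0].message);
}

// With the base “gpt-3.5-turbo-0613” model this call succeeds; swapping in the
// fine-tuned model id returns: Unrecognized request argument supplied: functions
reproduce().catch(console.error);
```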

Spec:
Environment: Node
OpenAI client version: 4.4.0
Langchain version: 0.0.148

Cheers for any help in advance
Will

Hi

I have found the answer myself in the OpenAI documentation: currently, a fine-tuned gpt-3.5 model does not support functions. I approximated the same result by asking in the system message for output that conforms to a JSON schema, validating the response with Zod, and then replying in the conversation with the validation error if validation fails.
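
For anyone hitting the same wall, here is a minimal sketch of that workaround. The schema, model id, and retry count are placeholders for illustration, not my production code:

```ts
import OpenAI from "openai";
import { z } from "zod";

// Hypothetical schema for illustration; substitute your own fields.
const AnswerSchema = z.object({
  category: z.string(),
  confidence: z.number().min(0).max(1),
});

type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

const systemPrompt =
  'Reply ONLY with JSON of the form { "category": string, "confidence": number between 0 and 1 }. No prose.';

async function getStructuredAnswer(question: string, maxRetries = 2) {
  // Conversation that we extend with validation feedback on failure.
  const messages: ChatMessage[] = [
    { role: "system", content: systemPrompt },
    { role: "user", content: question },
  ];

  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    const completion = await openai.chat.completions.create({
      model: "ft:gpt-3.5-turbo-0613:my-org::example", // placeholder fine-tuned model id
      messages,
    });

    const raw = completion.choices[0].message.content ?? "";

    let parsed: unknown;
    try {
      parsed = JSON.parse(raw);
    } catch {
      messages.push({ role: "assistant", content: raw });
      messages.push({ role: "user", content: "That was not valid JSON. Reply with JSON only." });
      continue;
    }

    const result = AnswerSchema.safeParse(parsed);
    if (result.success) return result.data;

    // Feed the Zod validation error back so the model can correct itself on the next turn.
    messages.push({ role: "assistant", content: raw });
    messages.push({
      role: "user",
      content: `The JSON failed validation: ${result.error.message}. Please correct it.`,
    });
  }

  throw new Error("Model did not return valid structured output");
}
```

It is not as robust as native function calling, but with a retry or two the fine-tuned model settles on valid JSON for our use case.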

Cheers
Will
