I am implementing structured-output parsing as described in the LangChain JS guide (I am not allowed to link to it), and it works nicely when I specify the model as “gpt-3.5-turbo-0613”. However, the standard model struggles with our fairly domain-specific use case, so we created a fine-tuned model, which produces better results in the Playground.
When I came to specify the newly fine-tuned model, I got the following error:
error: Unrecognized request argument supplied: functions
From some reading, it seems some of the older models do not support functions (this might only be true for Azure), but 0613 does support them. I used “gpt-3.5-turbo-0613” as the base model for the fine-tune. Is there something else I need to do to enable functions when using a fine-tuned model, or is it the case that fine-tuned models don’t yet support them?
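For reference, here is a minimal sketch of the kind of request body that gets sent when function calling is enabled. This is not my exact code: the fine-tune model ID and the function schema below are placeholders, and the `output_formatter` name is just what LangChain's structured-output chain typically uses.

```typescript
// Sketch of a chat completion request with a fine-tuned model plus functions.
// The "ft:" model name and the schema are placeholders, not real values.
const body = {
  // Fine-tuned models are addressed by their "ft:<base>:<org>::<id>" name
  model: "ft:gpt-3.5-turbo-0613:my-org::placeholder",
  messages: [{ role: "user", content: "Extract the fields from this text." }],
  // The "functions" argument that triggers the "Unrecognized request
  // argument supplied: functions" error in my case
  functions: [
    {
      name: "output_formatter",
      description: "Formats the response into the required structure.",
      parameters: {
        type: "object",
        properties: {
          answer: { type: "string", description: "The extracted answer" },
        },
        required: ["answer"],
      },
    },
  ],
  // Force the model to call the formatter function
  function_call: { name: "output_formatter" },
};
```

With the plain “gpt-3.5-turbo-0613” model this shape of request works; swapping only the `model` field to the fine-tune is what produces the error.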
OpenAI client version: 4.4.0
Langchain version: 0.0.148
Cheers for any help in advance