Forcing Chat Completion to use Function Calling first before answering using its own knowledge


I have a requirement to have our chat respond with an exact answer from our knowledge base before answering on its own. I am able to achieve this with the plain OpenAI Chat Completions API using function calling.

What I did was define each question as a function, and OpenAI maps the user's query to one whenever it decides a function is needed to answer. So far it works.

However, it is not consistent, since OpenAI decides whether a function needs to be called. Sometimes it answers with our pre-defined answer, and sometimes it ignores it and uses its own knowledge instead, so it is not perfect. I am looking for a way to force it to use my knowledge base first before falling back to OpenAI's stock knowledge.

Some of my functions make external API calls to answer questions, so I am not able to use a pure embeddings approach. If embeddings and function calling can be combined, that would be ideal.
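One way to combine the two, sketched below under assumptions: embed each knowledge-base question offline, embed the incoming user question (e.g. via the Embeddings API), and if the best cosine similarity clears a threshold, force the matching function; otherwise let the model decide. The entry shape, the `faq_refund` name in the usage note, and the `0.85` threshold are all hypothetical placeholders, not anything from my setup.

```javascript
// Cosine similarity between two embedding vectors (plain arrays here;
// real embeddings would come from the Embeddings API).
function cosineSimilarity(a, b) {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// kbEntries: [{ functionName, embedding }] built offline from KB questions.
// Returns a value suitable for the function_call parameter: either a
// forced { name } object or 'auto' when nothing in the KB is close enough.
function pickFunctionCall(queryEmbedding, kbEntries, threshold = 0.85) {
  let best = null;
  for (const entry of kbEntries) {
    const score = cosineSimilarity(queryEmbedding, entry.embedding);
    if (score >= threshold && (!best || score > best.score)) {
      best = { name: entry.functionName, score };
    }
  }
  return best ? { name: best.name } : 'auto';
}
```

The returned value would then be passed straight through as `function_call` in the completion request, so the model is only free to improvise when the KB has no close match.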

Here is a snippet of the configuration passed to the Chat Completions API:

  const initialResponse = await openai.chat.completions.create({
    model: 'gpt-3.5-turbo-16k-0613',
    temperature: 0,
    stream: true,
    function_call: 'auto',
    // messages and functions omitted from this snippet
  });

I’m open to ideas I can try out, since I am new to this as well. 🙂