I have a chatbot that uses LangChain and embeddings to answer user questions. Now that the function calling feature has been released in gpt-3.5-turbo-0613, I want to enhance the chatbot to ask follow-up questions using the function calling capability.
So I want to make a separate API call to decide which function should be used for the user query. The already implemented logic that uses LangChain and embeddings will be passed in as one function along with the other functions, and I want this call to return only the function that should be used, not an answer to the user's question.
Is this possible to achieve? It should only ask follow-up questions to collect the required parameter values from the customer and then return a function.
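Roughly what I have in mind, as a minimal sketch using the openai 0.x Python client; `answer_from_knowledge_base` and `create_support_ticket` are made-up placeholders standing in for my real LangChain/embeddings logic and other functions:

```python
import json
import openai  # openai-python 0.x; reads OPENAI_API_KEY from the environment

# Placeholder schemas: the first one wraps the existing LangChain + embeddings QA logic
FUNCTIONS = [
    {
        "name": "answer_from_knowledge_base",
        "description": "Answer a general question using the embeddings index.",
        "parameters": {
            "type": "object",
            "properties": {
                "question": {"type": "string", "description": "The user's question"},
            },
            "required": ["question"],
        },
    },
    {
        "name": "create_support_ticket",
        "description": "Open a support ticket for the customer.",
        "parameters": {
            "type": "object",
            "properties": {
                "email": {"type": "string", "description": "Customer email address"},
                "issue": {"type": "string", "description": "Short description of the issue"},
            },
            "required": ["email", "issue"],
        },
    },
]

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=[
        {
            "role": "system",
            "content": "Only decide which function to call. If required parameters are "
                       "missing, ask the user for them. Never answer the question yourself.",
        },
        {"role": "user", "content": "I want to raise a complaint"},
    ],
    functions=FUNCTIONS,
    function_call="auto",
)

message = response["choices"][0]["message"]
if message.get("function_call"):
    # The model picked a function; I only want its name and arguments back
    print(message["function_call"]["name"])
    print(json.loads(message["function_call"]["arguments"]))
else:
    # No function chosen yet; the content should be a follow-up question for the user
    print(message["content"])
```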
It sounds possible, but you may need to move away from LangChain or use less of it. LangChain abstracts a lot of the API calls behind many layers, so it’s easy to use but harder to customize. I’d suggest running LangChain in debug mode so you can see the prompts it is sending, and then start making your own API calls directly in your code with your desired functions/embeddings. You could also try asking on LangChain’s forums; maybe they’ve added more support around functions.
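For the debug part, these module-level flags (present in the 2023-era langchain package) are enough to see the raw prompts LangChain sends:

```python
import langchain

# Dump every chain / LLM call with its full inputs and outputs to stdout,
# so you can see exactly which prompts to replicate in your own direct API calls.
langchain.debug = True

# Lighter option: only print prompts and responses for verbose-enabled components.
langchain.verbose = True
```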
I have tried calling the OpenAI API directly, but the issue is that it returns the correct function while assuming dummy values for the function's required parameters.
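One thing that might help, on top of a system instruction like "never invent parameter values, ask the user instead": validate the arguments that come back before executing anything, and turn anything missing or obviously fabricated into a follow-up question. A rough sketch; the placeholder list and the "ask_user" convention are my own assumptions, not part of the API:

```python
import json

# Values the model tends to fabricate when it doesn't actually know a parameter.
SUSPICIOUS_VALUES = {"", "unknown", "dummy", "n/a", "example@example.com", "123456"}

def resolve_function_call(message, function_schema):
    """Return ("ask_user", missing) if required params look fake, else (name, args)."""
    call = message.get("function_call")
    if not call:
        return None
    args = json.loads(call["arguments"])
    required = function_schema["parameters"].get("required", [])
    missing = [
        p for p in required
        if str(args.get(p, "")).strip().lower() in SUSPICIOUS_VALUES
    ]
    if missing:
        # Don't execute the function; ask the customer for the real values instead.
        return ("ask_user", {"missing_parameters": missing})
    return (call["name"], args)
```

It's not bulletproof (the model can still invent plausible-looking values), but it at least stops the obvious dummy arguments from reaching the function.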
Same for me! I even added an instruction to the prompt to ask follow-up questions, but it doesn't work. Also, I'm curious how you are handling it when there are so many API calls, like choosing from over 100 APIs with complex responses?