I built a solution that takes HR manuals and company directories and allows users to quickly get contact info for people in their company. That worked great. Now, when I give it the same prompt as before, the system refuses to provide the answer. I understand the importance of being cautious, but this is going too far. There should be a way to get the data in my small language model displayed.
Has anyone else had this same issue? Have you resolved it?
The model is definitely becoming more sensitive, no disagreements there.
In your case though this seems to be a really simple lookup task. Using an LLM to perform this is, in my opinion, the equivalent of using a 16-wheeler truck to carry a TV. Huge waste of gas.
On the implementation side, you can use function calling to expose the directory search as a tool the model can invoke.
Then you respond to the tool_call with a follow-up API call containing the function's result, and the model will reply to the user with the details they requested.
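To make that concrete, here's a minimal sketch of the tool-call half of that flow. The directory data, `search_directory` function, and tool schema are all hypothetical stand-ins (in a real deployment the lookup would hit your HR system or LDAP); the shape of the tool_call and the `role: "tool"` follow-up message assume an OpenAI-style chat completions API.

```python
import json

# Hypothetical in-memory directory; in practice this would query your
# actual HR system or LDAP, not a dict.
DIRECTORY = {
    "jane doe": {"email": "jane.doe@example.com", "ext": "4421"},
    "raj patel": {"email": "raj.patel@example.com", "ext": "4377"},
}

def search_directory(name: str) -> str:
    """Deterministic lookup the model calls instead of answering from memory."""
    record = DIRECTORY.get(name.strip().lower())
    return json.dumps(record if record else {"error": "not found"})

# Tool schema you'd advertise to the model (OpenAI-style function calling).
TOOLS = [{
    "type": "function",
    "function": {
        "name": "search_directory",
        "description": "Look up a coworker's contact info by full name.",
        "parameters": {
            "type": "object",
            "properties": {"name": {"type": "string"}},
            "required": ["name"],
        },
    },
}]

def handle_tool_call(tool_call: dict) -> dict:
    """Run the requested function and build the follow-up 'tool' message.

    You append this message to the conversation and make a second API
    call, so the model can phrase the final answer for the user.
    """
    args = json.loads(tool_call["function"]["arguments"])
    result = search_directory(**args)
    return {
        "role": "tool",
        "tool_call_id": tool_call["id"],
        "content": result,
    }

# Simulated tool_call, shaped like what the API returns in the
# assistant message when the model decides to use the tool.
fake_call = {
    "id": "call_1",
    "function": {
        "name": "search_directory",
        "arguments": json.dumps({"name": "Jane Doe"}),
    },
}
print(handle_tool_call(fake_call))
```

The key point is that the contact info never has to come out of the model's weights: the model only decides *which* lookup to run, and your code supplies the authoritative answer, which sidesteps the refusal behavior entirely.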