Hi
I’ve read about creating your own tools, but so far all the resources I have found build tools that do something other than interact with a chat model.
Suppose my task needs to prompt a chat model N times, each time asking it to do something very specific. I am thinking of writing one tool for each of the N asks; that way I can collect them as functions and pass them to the model, like you would with an agent. But I don’t understand how to pass the ‘chat model’ to the tool itself. Have I misunderstood the purpose of tools entirely?
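To make the question concrete, this is the kind of pattern I’m imagining, in plain Python with the chat model stubbed out (FakeChatModel and make_next_special_day_tool are just made-up names for illustration) — the tool closes over a model instance that is created outside it:

```python
class FakeChatModel:
    """Stand-in for a real chat model, just for illustration."""
    def invoke(self, prompt: str) -> str:
        # A real model would generate this; here it is hard-coded.
        return "Thursday 28 11 2024"

def make_next_special_day_tool(model):
    """Factory: builds a tool function that has access to `model` via closure."""
    def next_special_day(special_day_name: str):
        # Prompt the model, then unpack its answer into structured values.
        response = model.invoke(f"When is the next {special_day_name}?")
        day_name, dd, mm, yyyy = response.split()
        return day_name, int(dd), int(mm), int(yyyy)
    return next_special_day

tool_fn = make_next_special_day_tool(FakeChatModel())
```

Is this factory/closure idea the right way to give a tool access to a chat model, or is there a built-in mechanism I’m missing?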
For example, I am writing a tool that asks a chat model when the next special day is (e.g., Thanksgiving), then unpacks the answer and returns structured values. My code is written with LangChain, but I suppose it resembles the basic flow of the OpenAI API.
What do I do with the next_special_day function?
from pydantic import BaseModel, Field
from langchain.agents import tool

class SpecialDayFinderInput(BaseModel):
    '''
    Represents the expected input to the tool next_special_day.
    '''
    special_day_name: str = Field(description="The name of the special day we want to find out about")

class SpecialDayFinderOutput(BaseModel):
    day_name: str = Field(description="The day of the week, such as Monday, Tuesday, Sunday, etc.")
    dd: int = Field(description="The numeric day in a month, e.g., 1, 9, 30")
    mm: int = Field(description="The numeric month in a year, e.g., 1, 9, 12")
    yyyy: int = Field(description="The numeric year, e.g., 2002, 1991, 2031")

@tool(args_schema=SpecialDayFinderInput)
def next_special_day(special_day_name: str) -> SpecialDayFinderOutput:
    """
    Prompt an LLM with the special_day_name param, process the response, and return the result.
    """
    return []  # placeholder -- this is the part I don't know how to fill in
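For the unpacking step, I was planning something like the sketch below, assuming I can instruct the model to answer in a fixed format such as "Thursday 28 11 2024" (the parse_special_day helper is a name I made up):

```python
from pydantic import BaseModel, Field

class SpecialDayFinderOutput(BaseModel):
    day_name: str = Field(description="The day of the week, such as Monday")
    dd: int = Field(description="The numeric day in a month")
    mm: int = Field(description="The numeric month in a year")
    yyyy: int = Field(description="The numeric year")

def parse_special_day(text: str) -> SpecialDayFinderOutput:
    # Assumes the model was prompted to reply as "<day_name> <dd> <mm> <yyyy>".
    day_name, dd, mm, yyyy = text.strip().split()
    return SpecialDayFinderOutput(day_name=day_name, dd=int(dd), mm=int(mm), yyyy=int(yyyy))
```

So the open question is whether next_special_day should itself call the model and then run something like parse_special_day on the reply, and if so, where the model instance is supposed to come from.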