I’m adding AI-assistant bot features to a social networking platform and want to let users augment the AI with a custom knowledge base. I experimented with the function calling feature and it works well: the AI can infer which function to call to get the information it needs, so for each block of information I add a function whose description says it returns that information. I have since learned about embeddings and wonder whether they are the more scalable approach, since the knowledge base, and hence the number of functions, will grow. Or does function calling replace the need for embeddings?
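For context, the embeddings approach usually means retrieving the most similar knowledge chunk by vector similarity and passing it to the model, rather than exposing one function per chunk. A minimal sketch of that retrieval step, using toy 3-dimensional vectors in place of real embedding vectors (in practice each chunk and the query would be embedded with an embeddings endpoint such as `text-embedding-ada-002`; the chunk names here are made up for illustration):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy vectors standing in for real embeddings of knowledge-base chunks.
knowledge_base = {
    "shipping policy": [1.0, 0.0, 0.0],
    "refund policy":   [0.0, 1.0, 0.0],
    "office hours":    [0.0, 0.0, 1.0],
}

def retrieve(query_vector, top_k=1):
    """Return the top_k chunk names most similar to the query vector."""
    ranked = sorted(
        knowledge_base.items(),
        key=lambda item: cosine_similarity(query_vector, item[1]),
        reverse=True,
    )
    return [name for name, _ in ranked[:top_k]]

print(retrieve([0.9, 0.3, 0.1]))  # → ['shipping policy']
```

The retrieved chunk text then goes into the prompt, so the number of chunks can grow without growing the function list, which is why embeddings tend to scale better than one-function-per-chunk.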
I’m currently testing embeddings using ChromaDB and LangChain, and it has worked fine so far.
Every time I send a request to the API, the database stores the user text and the bot response.
The problem with embeddings is that it isn’t a smart system: every query just searches for similar text, unlike LLM models such as ChatGPT, which have a sort of “cognitive reasoning”.
I tried to implement my own simple memory logic, but it doesn’t work:
# Recursively load historical conversation "memory chunks" until the
# model can answer the user's question.
# Assumes `memory` (a list of message lists), `pass_function`, and
# `num_tokens_from_messages` are defined elsewhere.
import json
import openai

def memorize(message, iterator):
    if iterator == -1:
        return "i don't remember"
    # Append the question to the current memory chunk
    memory[iterator].append({"role": "user", "content": message})
    print(f"num tokens={num_tokens_from_messages(memory[iterator])}")
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo-0613",
        messages=memory[iterator],
        max_tokens=100,
        functions=pass_function,
        function_call="auto",
    )
    response_message = response["choices"][0]["message"]
    print(memory[iterator])
    # Delete the question so it doesn't pollute future questions
    del memory[iterator][-1]
    # Step 2: check if GPT wanted to call a function
    if response_message.get("function_call"):
        # Step 3: call the function
        # Note: the JSON response may not always be valid; be sure to handle errors
        available_functions = {
            "not_exists": memorize,
        }  # only one function in this example, but you can have multiple
        function_name = response_message["function_call"]["name"]
        function_to_call = available_functions[function_name]
        function_args = json.loads(response_message["function_call"]["arguments"])
        answer = function_args.get("pass_param")
        # Recursive call into the next-older memory chunk
        if answer:
            return function_to_call(message=message, iterator=iterator - 1)
    else:
        # The model found an answer in this chunk
        return response_message["content"]
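For what it’s worth, the recursion above assumes `memory` is a list of message lists, oldest chunk first, and that the search starts from the newest chunk. A minimal sketch of how those chunks might be seeded from past conversation turns (the chunk size, system prompt, and `build_memory` helper are illustrative assumptions, not part of the original code):

```python
# Split past (user, assistant) turns into fixed-size "memory chunks",
# each a self-contained message list that can be replayed to the model.
SYSTEM_PROMPT = {
    "role": "system",
    "content": "Answer from this conversation history, or call the "
               "fallback function if the answer is not here.",
}

def build_memory(history, turns_per_chunk=2):
    """history: list of (user_text, assistant_text) tuples, oldest first."""
    memory = []
    for i in range(0, len(history), turns_per_chunk):
        chunk = [SYSTEM_PROMPT]
        for user_text, assistant_text in history[i:i + turns_per_chunk]:
            chunk.append({"role": "user", "content": user_text})
            chunk.append({"role": "assistant", "content": assistant_text})
        memory.append(chunk)
    return memory

memory = build_memory([
    ("My name is Sam.", "Nice to meet you, Sam!"),
    ("I live in Oslo.", "Oslo is a beautiful city."),
    ("I have a cat.", "Cats make great companions."),
])
# memorize() would then be invoked starting from the newest chunk:
# answer = memorize("What is my name?", iterator=len(memory) - 1)
```

One caveat with this design: each chunk is queried in isolation, so the token budget per request stays small, but an answer spread across two chunks will never be found.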