I am trying to write a chatbot to help me write RPG lore.
I already wrote a function that sends back a piece of lore as JSON.
Now I want to be able to tell the model that this lore is fine by me and to save it.
So I wrote another function:
def save_content(content: str):
    """
    Save the content to a file when the user asks to do so.

    Args:
        content (str): The content to be saved.

    Returns:
        str: A message indicating that the content has been saved.
    """
    with open('content.txt', 'w') as file:
        file.write(content)
    return "Content saved!\n"
I added this function to the list of functions that can be called, and I updated the prompt to tell the model to use this function when asked to save.
But this does not seem to work.
I am using gpt-3.5-turbo-0613. Any ideas?
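For reference, with gpt-3.5-turbo-0613 the function definitions are normally passed on the chat completion call itself, roughly like this (a simplified sketch with the openai Python SDK; function_definitions is a placeholder for the actual list, not my exact code):

from openai import OpenAI

client = OpenAI()

# Hedged sketch: hand the model the function schemas and let it decide when to call one.
response = client.chat.completions.create(
    model="gpt-3.5-turbo-0613",
    messages=messages,                  # running conversation, including the system prompt
    functions=function_definitions,     # JSON schemas, including the one for save_content
    function_call="auto",
)

# With this legacy interface, a call shows up on message.function_call.
assistant_message = response.choices[0].message
if assistant_message.function_call:
    print(assistant_message.function_call.name, assistant_message.function_call.arguments)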
{
  "name": "writeFile",
  "description": "Write the contents of a file",
  "parameters": {
    "type": "object",
    "properties": {
      "path": {
        "type": "string",
        "description": "The file path to write into"
      },
      "content": {
        "type": "string",
        "description": "The content to write"
      }
    },
    "required": ["path", "content"]
  }
}
You must handle the required_action state using this function:
const fs = require('fs');
const path = require('path');

// Write the contents of a specified file
function writeFile(args) {
  try {
    const parsedArgs = JSON.parse(args);
    console.log(parsedArgs);
    const { path: filePath, content } = parsedArgs;

    // Validate content is a non-empty string
    if (typeof content !== 'string' || content.trim() === '') {
      return { success: false, message: 'Content to be written is invalid or empty.' };
    }

    // Ensure that the directory exists (create if not)
    const directory = path.dirname(filePath);
    fs.mkdirSync(directory, { recursive: true });

    // Write the file
    fs.writeFileSync(filePath, content, { encoding: 'utf-8' });
    console.log({ success: true, message: 'File written successfully.' });
    return { success: true, message: 'File written successfully.' };
  } catch (error) {
    console.log(error);
    // Handle specific errors
    switch (error.code) {
      case 'EACCES':
        return { success: false, message: 'You do not have permission to write to this file.' };
      case 'ENOSPC':
        return { success: false, message: 'There is not enough space on the disk to save the file.' };
      case 'ENOENT':
        return { success: false, message: 'The specified file path could not be found.' };
      default:
        return { success: false, message: 'An unexpected error occurred while writing the file. Please try again.' };
    }
  }
}
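The function above only covers the file write itself; the surrounding required_action handling might look roughly like this (a sketch against the Assistants API with the openai Python SDK; thread, run, and write_file are placeholders for your own objects and handler):

import json

run = client.beta.threads.runs.retrieve(thread_id=thread.id, run_id=run.id)
if run.status == "requires_action":
    tool_outputs = []
    for call in run.required_action.submit_tool_outputs.tool_calls:
        if call.function.name == "writeFile":
            # Do the actual file write (the equivalent of writeFile above)
            # and serialise its result for the model.
            result = write_file(call.function.arguments)
            tool_outputs.append({"tool_call_id": call.id, "output": json.dumps(result)})
    # Hand the results back so the run can continue.
    run = client.beta.threads.runs.submit_tool_outputs(
        thread_id=thread.id, run_id=run.id, tool_outputs=tool_outputs
    )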
Thanks for the answer.
That’s already more or less what I did, but in Python.
I think the problem is more that the function is never called.
I am new to this so I may be doing something wrong, but I asked the model in the system prompt to specifically call this function when I ask it to save the created content:
You are a DM helper. You help dungeon masters create interesting pieces of lore for places. When asked to create a place you call the place_extract_function and you provide a name for the place if none is provided, the region in which this place exists, the population number, a detailed description of the place and a summary of this lore. You only give that information. You don’t greet the user; you only give the place information. If asked to modify the lore, you try to modify the minimum amount needed to include the modifications. If asked to save the newly generated content you call the save_content function.
The function is minimal for now:
def save_content(content: str):
    """
    Save the content to a file when the user asks to do so.

    Args:
        content (str): The content to be saved.

    Returns:
        str: A message indicating that the content has been saved.
    """
    with open('content.txt', 'w') as file:
        file.write(content)
    print("save_content function here")
    return "Content saved!\n"
custom_functions.append(
    {
        "name": "save_content",
        "description": "Write the contents of a file",
        "parameters": {
            "type": "object",
            "properties": {
                "content": {
                    "type": "string",
                    "description": "The content to write"
                }
            },
            "required": ["content"]
        }
    }
)
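For completeness, the chat completion call that goes with a definition like this and a tool_calls-based handler (like the snippet further down) would look roughly as follows (a sketch assuming the openai>=1.0 SDK, not necessarily the exact call used here):

# Hedged sketch: wrap each plain function schema as a tool so the response
# carries tool_calls rather than the legacy function_call field.
tools = [{"type": "function", "function": f} for f in custom_functions]

response = client.chat.completions.create(
    model="gpt-3.5-turbo-0613",
    messages=messages,
    tools=tools,
    tool_choice="auto",
)
assistant_message = response.choices[0].message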
Thanks, but I’ve not worked with Assistants API… yet…
Oh, you meant the RPG lore bit! Hrm…
Sounds like a good idea, though. I think I’m going to do an API for my SaaS then hook it up to a Custom GPT or Assistant eventually, so things might be a bit different. Also sounds like you’re on the right track.
# Call a function if there is one
messages.append({"role": assistant_message.role, "content": assistant_message.content})
if assistant_message.tool_calls:
    results = execute_function_call(assistant_message, client)
    messages.append({"role": "function", "tool_call_id": assistant_message.tool_calls[0].id, "name": assistant_message.tool_calls[0].function.name, "content": results})
return messages
match function_name:
    case "save_place":
        results = save_place(arguments)
    case "create_place":
        results = create_place(arguments, client)
    case "modify_place":
        results = modify_place(arguments, client)
    case "create_character":
        results = create_character(arguments, client)
    case "modify_character":
        results = modify_character(arguments, client)
    case "create_other":
        results = create_other(arguments, client)
    case "modify_other":
        results = modify_other(arguments, client)
    case _:
        results = f"Error: function {function_name} does not exist"
return results
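The function_name and arguments used by the match come off the tool call on the assistant message; a minimal extraction, assuming the tool_calls shape from the snippet above (not necessarily the exact code inside execute_function_call), would be:

import json

# Assumed extraction; the names match the dispatch above.
call = assistant_message.tool_calls[0]
function_name = call.function.name
arguments = json.loads(call.function.arguments)   # dict of the model-supplied arguments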