Here’s an example of how to use the Assistants API (cookbook.openai.com/examples/assistants_api_overview_python). In the example, we need to specify the structure of the input for the external function display_quiz in function_json. How can we specify the structure of the output from display_quiz? In the example, its output is a list of strings. But what if its output is not a list of strings, for example a list of lists? Do we need to specify it, or can the LLM recognize it without any specification?
function_json = {
    "name": "display_quiz",
    "description": "Displays a quiz to the student, and returns the student's response. A single quiz can have multiple questions.",
    "parameters": {
        "type": "object",
        "properties": {
            "title": {"type": "string"},
            "questions": {
                "type": "array",
                "description": "An array of questions, each with a title and potentially options (if multiple choice).",
                "items": {
                    "type": "object",
                    "properties": {
                        "question_text": {"type": "string"},
                        "question_type": {
                            "type": "string",
                            "enum": ["MULTIPLE_CHOICE", "FREE_RESPONSE"],
                        },
                        "choices": {"type": "array", "items": {"type": "string"}},
                    },
                    "required": ["question_text"],
                },
            },
        },
        "required": ["title", "questions"],
    },
}
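For context on the question: in the cookbook flow, the schema above only describes the function's *input* arguments. The function's return value is handed back to the run as a single string (via the tool-outputs submission step), so there is no output schema to declare. A minimal sketch of how a nested return value, such as a list of lists, could be serialized before being passed back (the `responses` value here is a made-up example, not from the cookbook):

```python
import json

# Suppose display_quiz returns a nested structure (a list of lists)
# instead of a flat list of strings:
responses = [["a", "b"], ["free response text"]]

# Tool outputs are submitted back as a plain string, so the nested
# structure can simply be JSON-serialized; the model sees the string
# and infers the structure from the serialized JSON itself.
output_str = json.dumps(responses)
print(output_str)  # '[["a", "b"], ["free response text"]]'
```

In the cookbook, this string would then go into the `output` field of the tool-outputs submission for the matching tool call.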