Function_call does not fully support JSON Schema features

Summary

# input:
"race": {
    "type":"string",
    "enum": ["orc","human","gremlin"]
}
# output: dragon

Example

{
    "name": "generate_monster",
    "description": "Generate a monster with attributes.",
    "parameters": {
        "type": "object",
        "properties": {
            "name": {
                "type": "string",
                "pattern": "^(\\([0-9]{3}\\))?[0-9]{3}-[0-9]{4}$"
            },
            "attack": {
                "type": "number",
                "minimum": 20,
                "exclusiveMaximum": 100,
                "description": "The monster's attack value."
            },
            "hp": {
                "type": "number",
                "minimum": 2000,
                "description": "The monster hp"
            },
            "race": {
                "type": "string",
                "enum": [
                    "human",
                    "orc",
                    "gremlin",
                    "dragon"
                ],
                "description": "The monster's race."
            }
        },
        "required": [
            "name",
            "attack",
            "hp",
            "race"
        ]
    }
}
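Until the backend enforces these constraints, one workaround is to validate the returned arguments yourself before using them. Here is a rough sketch in Python using the jsonschema package (the package choice and the trimmed-down copy of the schema are my own assumptions, not something the API provides):

import json

import jsonschema

# The "parameters" object of a function definition is itself a JSON Schema,
# so it can be reused directly for client-side validation.
monster_schema = {
    "type": "object",
    "properties": {
        "race": {"type": "string", "enum": ["human", "orc", "gremlin", "dragon"]},
        "hp": {"type": "number", "minimum": 2000},
    },
    "required": ["race", "hp"],
}

def validate_arguments(arguments_json):
    """Parse the model's function_call arguments and check them against the schema."""
    args = json.loads(arguments_json)
    jsonschema.validate(instance=args, schema=monster_schema)  # raises ValidationError
    return args

# A value outside the enum (like the "dragon" case in the summary above) would be
# caught here instead of silently passing through:
try:
    validate_arguments('{"race": "vampire", "hp": 2500}')
except jsonschema.ValidationError as err:
    print("Model output violated the schema:", err.message)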

It would be really great if OpenAI gave more specifics about which features are supported. Besides functionality like the above, there are certain things that would be useful for injecting additional context into the function definition. For example, JSON Schema supports examples, which seems like a nice way to add some k-shot context to the system prompt. However, when I test this, the prompt_tokens count coming back from the OpenAI API does not change when I include the examples in the function's JSON schema, which suggests to me that they aren't being used on the backend. Rather than just wasting my time poking the API and seeing what happens, it would really be ideal if OpenAI gave more specifics.
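For anyone who wants to repeat that check, this is roughly what I mean by comparing prompt_tokens (a sketch against the 0.x openai Python client; the model name and the trimmed schema are just placeholders):

import copy

import openai

openai.api_key = "sk-..."  # your API key

base_function = {
    "name": "generate_monster",
    "description": "Generate a monster with attributes.",
    "parameters": {
        "type": "object",
        "properties": {
            "race": {"type": "string", "enum": ["human", "orc", "gremlin", "dragon"]},
        },
        "required": ["race"],
    },
}

# Identical definition, but with an "examples" keyword added to one property.
function_with_examples = copy.deepcopy(base_function)
function_with_examples["parameters"]["properties"]["race"]["examples"] = ["orc", "dragon"]

def prompt_tokens(function_def):
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo-0613",
        messages=[{"role": "user", "content": "Generate a monster."}],
        functions=[function_def],
        function_call={"name": "generate_monster"},
    )
    return response["usage"]["prompt_tokens"]

# If both counts come back identical, the "examples" text is presumably being
# dropped before it ever reaches the model.
print(prompt_tokens(base_function), prompt_tokens(function_with_examples))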

Interesting, I had noticed that I got an error with the “array” type, either at the top level or nested within an object.

Thanks for pointing out the discrepancies in the “enum” type too. I had been assuming that worked based on what I’d seen, but clearly I just hadn’t tested enough values.
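To illustrate the “array” case I mean, the shape was roughly an array-valued property nested inside the parameters object, along these lines (a made-up reconstruction of the shape, not the exact schema that errored):

# Made-up function definition with an array-typed property nested in the
# object -- the general shape that produced the error, not the exact schema.
party_function = {
    "name": "generate_party",
    "description": "Generate a party of monsters.",
    "parameters": {
        "type": "object",
        "properties": {
            "members": {
                "type": "array",
                "description": "Races of the party members.",
                "items": {"type": "string", "enum": ["human", "orc", "gremlin"]},
            }
        },
        "required": ["members"],
    },
}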

Don’t forget to add a system prompt!

System prompt: Use the information in this thread to populate the arguments of the function called in this message! Look at the function_name and make up the arguments that are missing!
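Something along these lines, for instance (a minimal sketch with the 0.x openai Python client; the trimmed function definition and the model name are just placeholders standing in for the full Example above):

import openai

openai.api_key = "sk-..."  # your API key

# Trimmed-down stand-in for the generate_monster definition from the Example above.
generate_monster = {
    "name": "generate_monster",
    "description": "Generate a monster with attributes.",
    "parameters": {
        "type": "object",
        "properties": {
            "race": {"type": "string", "enum": ["human", "orc", "gremlin", "dragon"]},
            "hp": {"type": "number", "minimum": 2000},
        },
        "required": ["race", "hp"],
    },
}

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=[
        {
            "role": "system",
            "content": (
                "Use the information in this thread to populate the arguments of the "
                "function called in this message! Look at the function_name and make "
                "up the arguments that are missing!"
            ),
        },
        {"role": "user", "content": "Generate a monster."},
    ],
    functions=[generate_monster],
    function_call={"name": "generate_monster"},
)

# The model's arguments come back as a JSON string.
print(response["choices"][0]["message"]["function_call"]["arguments"])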