Model GPT-4o doesn't have the same capabilities as the dated model it points to

It would be really great to have some confirmation of this from OpenAI. Although, as noted, the documentation says gpt-4o is compatible, this:

client.beta.assistants.create(
    ...
    model="gpt-4o",
    response_format={
        "type": "json_schema",
    ...

produces:

BadRequestError: Error code: 400 - {'error': {'message': "Invalid parameter: 'response_format' of type 'json_schema' is not supported with model version `gpt-4o`.",
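
For reference, a self-contained sketch of the failing call looks roughly like this (the schema below is just an illustrative placeholder, not the one from my application):

from openai import OpenAI

client = OpenAI()

try:
    assistant = client.beta.assistants.create(
        name="repro bot",
        model="gpt-4o",  # the alias that currently fails
        response_format={
            "type": "json_schema",
            "json_schema": {
                "name": "demo_schema",
                "strict": True,
                "schema": {
                    "type": "object",
                    "properties": {"answer": {"type": "string"}},
                    "required": ["answer"],
                    "additionalProperties": False
                }
            }
        }
    )
    print("created:", assistant.id)
except Exception as err:
    print(err)  # the BadRequestError shown above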

This old topic is from before json_schema existed as a response_format.

The question “is it compatible?” now has a different answer, for a different reason.

Just like this company screwed up sending images to the “gpt-4-turbo” alias, it looks like here, with Assistants, the “gpt-4o” alias is screwed up and doesn’t reflect the capabilities of the model it points to within Assistants, which is “gpt-4o-2024-08-06”.

Trials

“gpt-4o”: Not supported with model version
“gpt-4o-2024-08-06”: OK
“gpt-4o-mini”: OK
"gpt-4o-mini-2024-07-18: OK
“gpt-4o-2024-05-13”: Not supported (expected response)

Thus, you will have to specify model="gpt-4o-2024-08-06" in your API request until the alias pointing to it is fixed.
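
If you want to see which dated gpt-4o snapshots your key can actually use (so you know what to pin), a quick sketch with the same library:

from openai import OpenAI

client = OpenAI()

# List the model IDs visible to this API key and filter for gpt-4o variants,
# so a dated snapshot can be pinned instead of the alias.
model_ids = sorted(model.id for model in client.models.list())
for model_id in model_ids:
    if "gpt-4o" in model_id:
        print(model_id)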


Here is a script to test for screwups like this in the future: it goes through a list of models, creating an assistant with each (then deleting it). It also shows how a working json_schema for Assistants looks and how it should be sent.

from openai import OpenAI

client = OpenAI()

# Add other model names to this list to test them, e.g. "gpt-4o" or "gpt-4o-mini-2024-07-18"
for model in ["gpt-4o-2024-08-06"]:
    assistant_id = None
    try:
        assistant = client.beta.assistants.create(
            instructions="You help, employing your pretrained world knowledge.",
            name="TEST schema bot",
            tools=[],     # no code_interpreter, no file_search
            model=model,
            # The Python openai library accepts a Python dict, not a JSON string, as response_format
            response_format={  # START SCHEMA CONTENT
                "type": "json_schema",
                "json_schema": {
                    "name": "response_format_schema",
                    "strict": True,
                    "schema": {
                        "type": "object",
                        "properties": {
                            "text_to_user": {
                                "type": "string",
                                "description": "The response that the user will see."
                            },
                            "disposition_of_user": {
                                "type": "string",
                                "description": "The mood expressed by the user.",
                                "enum": ["neutral", "positive", "negative"]
                            },
                            "ai_was_helpful": {
                                "type": "boolean",
                                "description": "Indicates if the AI provided a perfect solution to a problem."
                            }
                        },
                        "required": ["text_to_user", "disposition_of_user", "ai_was_helpful"],
                        "additionalProperties": False
                    }
                }
            }  # END SCHEMA CONTENT
        )
        assistant_id = assistant.id  # the response is a pydantic model; grab the new assistant's ID
        print(model, " - created ID: ", assistant_id)

    except Exception as the_API_barfed:
        print(the_API_barfed)

    if assistant_id:
        try:
            deleted_response = client.beta.assistants.delete(assistant_id)
            print(deleted_response)
        except Exception as the_API_delete_failed:
            print(the_API_delete_failed)
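
And for anyone wondering what comes back once such an assistant exists, here is a rough usage sketch (the assistant ID and user text are placeholders): the reply text of a run is a JSON string conforming to the schema, so you just parse it.

import json
from openai import OpenAI

client = OpenAI()

assistant_id = "asst_..."  # placeholder: an assistant created as above

# Create a thread, add a user message, run the assistant, and parse the JSON reply
thread = client.beta.threads.create()
client.beta.threads.messages.create(
    thread_id=thread.id, role="user", content="My printer is on fire, please help."
)
run = client.beta.threads.runs.create_and_poll(
    thread_id=thread.id, assistant_id=assistant_id
)
if run.status == "completed":
    messages = client.beta.threads.messages.list(thread_id=thread.id)
    reply_text = messages.data[0].content[0].text.value  # newest message first
    data = json.loads(reply_text)
    print(data["text_to_user"], data["disposition_of_user"], data["ai_was_helpful"])
else:
    print(run.status, run.last_error)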