The API docs appear to be mum on which roles and parameters are accepted by o1 models (temperature, system, etc.). I can pass the system role in o1 chat messages, but not in o1-mini chats. See the error thrown below when calling o1-mini. Does this match anyone else's experience?
messages = [{"role": "system", "content": system_message}…]
An error occurred: Unsupported value: 'messages[0].role' does not support 'system' with this model.
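For anyone hitting the same error, here's a minimal sketch of a workaround, assuming your only goal is to keep the system instructions: fold the system content into the first user message for models that reject the system role. The `NO_SYSTEM_ROLE` set and the `build_messages` helper are my own names, not anything from the official docs.

```python
from openai import OpenAI

client = OpenAI()

# Models that (as of this writing) reject the "system" role entirely.
# Assumption: this set may change; trust the error message over the docs.
NO_SYSTEM_ROLE = {"o1-mini", "o1-preview"}

def build_messages(model: str, system_message: str, user_message: str) -> list[dict]:
    """Fold the system prompt into the user turn for models that reject it."""
    if model in NO_SYSTEM_ROLE:
        return [{"role": "user", "content": f"{system_message}\n\n{user_message}"}]
    return [
        {"role": "system", "content": system_message},
        {"role": "user", "content": user_message},
    ]

system_message = "You are a concise assistant."
user_message = "Summarize the trade-offs of the o1 model family."

response = client.chat.completions.create(
    model="o1-mini",
    messages=build_messages("o1-mini", system_message, user_message),
)
print(response.choices[0].message.content)
```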
OpenAI has completely obliterated the existing documentation for the September API versions of o1-mini and o1-preview, in typical legacy-destroying fashion. The docs now refer to models without version numbers.
It is further tangled because the documentation seems to refer to an o1-mini that is a replacement for the earlier o1-mini, rolled out to an undocumented number of tier-5 users under unknown qualification criteria.
The correct role to pass to the newest version of o1 is "developer"; "system" being accepted is a fluke contrary to the docs (though it seems a different role name was never needed at all, since they transform inputs before they reach the API models). The prior 2024-09-xx models, which still include the documented o1-mini, accept only "user" (plus "assistant" as needed for conversation continuation).
These updates should be incorporated into the actual API docs (e.g., in the sections dedicated to the relevant endpoints). In any event, system messages are currently accepted by 'o1', though not by 'o1-mini' or 'o1-preview'. Perhaps an unintended fluke.
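To make the role situation concrete, here's a small sketch, under the assumption that the behavior described above still holds: "developer" for the current o1, and plain "user" messages for the 2024-09 snapshots. Model names are illustrative; verify against whatever snapshot you actually have access to.

```python
from openai import OpenAI

client = OpenAI()

# Current o1: accepts the "developer" role (and, apparently by fluke, "system").
response = client.chat.completions.create(
    model="o1",
    messages=[
        {"role": "developer", "content": "Answer in one sentence."},
        {"role": "user", "content": "What is the capital of France?"},
    ],
)

# 2024-09 snapshots (o1-mini, o1-preview): "user" and "assistant" roles only.
response = client.chat.completions.create(
    model="o1-mini",
    messages=[{"role": "user", "content": "What is the capital of France?"}],
)
```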
I bypassed the issue by using BAML (BoundaryML), which lets you use o1 and subsequent models with structured outputs and no need to specify a system role. Life saver for using these models right now.
I've just used it across the simple examples they provide in the docs (including structured outputs), not in a production setting. I typically work in Python, and I believe ell is a Python package - not sure if there are JavaScript bindings.