Issue with o1-mini Model's Response Structure

I need to interact with the o1-mini API. I am using o1-mini to solve math problems.
The prompt is "Please answer the question" + "Question body".
I am using a MongoDB database to store the returned information.
Currently, I ask the other models to return their results in JSONL format.

Can I ask o1-mini to return the response the same way I do for the 4o model? For example, the prompt for 4o is:

### Output Format: 
Provide the result in JSON format:

{
  "problem_uuid": "{problem_uuid}",
  "overall": "equivalent" or "not equivalent",
  "explanation": "Provide a detailed explanation of the equivalence judgment, focusing on the similarity of the final answers even if the methods differ."
}

Or can I ask o1-mini to return the result with this prompt:

"Please answer the question" + "Question body"
### Output Format:
Provide the result in JSON format:

{
  "problem_uuid": "{problem_uuid}",
  "calculate_again_process": "Provide the solution process"
}

If not, how should I get o1-mini to return a result in a form that can be stored in MongoDB?
I guess it's enough to give o1-mini a very simple prompt like "Please answer the question", right?
Thank you very much.

Have you seen the o1 examples in the OpenAI Cookbook?

Note: I did not look at the examples there to see if they can specifically help you, but knowing there is a location for such OpenAI Cookbook o1 examples is of value.


Per the documentation, o1 won't strictly produce JSON.

Most probably the JSON will be embedded somewhere in the output, along with explanations, etc.

I recommend calling gpt-4o / gpt-4o-mini on the result afterwards to extract the JSON as a structured output, as this is supported starting with the gpt-4o-2024-08-06 model version.
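
A minimal sketch of that two-step pipeline in Python, assuming the openai and pymongo packages: o1-mini answers with a plain prompt, gpt-4o-mini extracts the fields via Structured Outputs, and the parsed document is inserted into MongoDB. The CalcResult schema, the prompts, and the database/collection names are illustrative assumptions, not a definitive implementation:

```python
from openai import OpenAI
from pydantic import BaseModel
from pymongo import MongoClient

client = OpenAI()


class CalcResult(BaseModel):
    # Hypothetical schema mirroring the fields you want to store.
    problem_uuid: str
    calculate_again_process: str


def solve_and_store(problem_uuid: str, question_body: str) -> dict:
    # Step 1: let o1-mini do the reasoning with a simple prompt.
    # Its answer may wrap any JSON in surrounding prose.
    reasoning = client.chat.completions.create(
        model="o1-mini",
        messages=[
            {"role": "user", "content": "Please answer the question\n" + question_body}
        ],
    )
    raw_answer = reasoning.choices[0].message.content

    # Step 2: have gpt-4o-mini extract the fields as a Structured Output,
    # which constrains the response to the CalcResult schema.
    extraction = client.beta.chat.completions.parse(
        model="gpt-4o-mini",
        messages=[
            {
                "role": "user",
                "content": (
                    f"problem_uuid: {problem_uuid}\n"
                    "Extract the solution process from the answer below.\n\n"
                    + raw_answer
                ),
            }
        ],
        response_format=CalcResult,
    )
    doc = extraction.choices[0].message.parsed.model_dump()

    # Step 3: store the parsed document in MongoDB
    # (database/collection names here are placeholders).
    MongoClient()["math"]["results"].insert_one(doc)
    return doc
```

This keeps o1-mini's prompt minimal, as the documentation suggests, and pushes all formatting guarantees onto the extraction model rather than onto o1-mini itself.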
