Batch API is terribly behind everything else

It doesn’t support reusable prompts or JSON schema, and the docs don’t mention this anywhere. I basically wasted time building a pipeline and then trying to use it directly with batches. Does anyone know if this is on the docket?

It does support json_schema: you can still get structured outputs in stringified form. The prerequisite is that you’re using a model that supports both the Batch API and structured outputs, but that covers almost all modern models (e.g. gpt-4o, gpt-4.1, o3, and gpt-5). In the body of each of your input payloads, include:

'text': {
    'format': {
        'type': 'json_schema',
        'name': 'structured_response',
        'strict': True,
        'schema': <your structured output schema here>
    }
}

The result, as mentioned, will be stringified, so once you download the response, just call json.loads() to convert each request’s output into a Python dictionary.
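To tie the pieces together, here is a minimal sketch of building one Batch API input line with the structured-output block above. The schema, the `make_batch_line` helper, and the prompt are illustrative assumptions, not part of the thread; this assumes you are targeting the `/v1/responses` endpoint in your JSONL file.

```python
import json

# Hypothetical schema for illustration only.
schema = {
    "type": "object",
    "properties": {"answer": {"type": "string"}},
    "required": ["answer"],
    "additionalProperties": False,
}

def make_batch_line(custom_id, prompt, model="gpt-4o"):
    """Build one JSONL line for a Batch API request with a json_schema format block."""
    body = {
        "model": model,
        "input": prompt,
        "text": {
            "format": {
                "type": "json_schema",
                "name": "structured_response",
                "strict": True,
                "schema": schema,
            }
        },
    }
    # Each line of the batch input file is a standalone JSON object.
    return json.dumps(
        {"custom_id": custom_id, "method": "POST", "url": "/v1/responses", "body": body}
    )

# Write a one-request input file; in practice you would write one line per request.
with open("batch_input.jsonl", "w") as f:
    f.write(make_batch_line("req-1", "What is 2 + 2?") + "\n")
```

When the batch completes, the model’s structured output in each result line is a JSON string, so `json.loads()` on it gives you the dictionary matching your schema.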

I just got official guidance from an OpenAI rep via support telling me the Batch API does not support json_schema output.

I just submitted a file with schema output and it worked lol


Awesome, happy to hear that! Yeah, I can’t imagine support being very knowledgeable about current capabilities. When you have a technical question, just ask here.
