The GPT-5.1 and GPT-5.2 model pages have now been updated to document batch support.
Please note and review GPT-5.2 Pro, however:
https://platform.openai.com/docs/models/gpt-5.2-pro
Like GPT-5.2 before it, it does not have the “batch” flag set on its model listing page. GPT-5 Pro, however, is listed as a batch model, so we know the pro models can in principle be batched.
Additionally, there are reported problems running it on the batch endpoint: the calls are not refused outright, but they behave abnormally from an API user’s perspective.
Actions
- Update the Pro model listing page so the “batch” flag reflects the model’s actual capability;
- Investigate and replicate the reported symptom that “gpt-5.2 batch runs repeatedly at high expense”.
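For reproducing the batch symptom, a minimal input file is the starting point. Below is a sketch of building one line of a Batch API input file (`.jsonl`); the model id `gpt-5.2-pro` and the `/v1/responses` target are assumptions for illustration, to be adjusted to whatever the docs confirm.

```python
import json

def batch_request_line(custom_id: str, prompt: str,
                       model: str = "gpt-5.2-pro") -> str:
    """Serialize one Batch API request line (JSONL format)."""
    request = {
        "custom_id": custom_id,   # caller-chosen id to match results back
        "method": "POST",
        "url": "/v1/responses",   # endpoint the batch worker will call (assumed)
        "body": {
            "model": model,       # assumed model id; not yet confirmed batchable
            "input": prompt,
        },
    }
    return json.dumps(request)

# Write a tiny input file; it would then be uploaded with purpose="batch"
# and submitted to the Batches endpoint with completion_window="24h".
lines = [batch_request_line(f"req-{i}", "Say hello.") for i in range(3)]
with open("batch_input.jsonl", "w") as f:
    f.write("\n".join(lines) + "\n")
```

Running the same small file repeatedly while watching usage should make any “runs repeatedly at high expense” behavior easy to demonstrate.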
Thanks for your attention.