GPT-5.1 & 5.2 Batch Availability

Per the website, these models are not available on the Batch API. However, I sent batch requests and they all completed fine. Is there a reason for this? Am I secretly being routed to gpt-5?
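For anyone who wants to reproduce this, here is roughly how I checked (a sketch, assuming the `openai` Python SDK; the model name and prompts are just my test values, and the routing check simply reads the `model` field that each batch output line echoes back):

```python
import json

# Build a Batch API input file: one JSON request object per line (JSONL).
# "gpt-5.1" is the model under discussion; swap in whatever you are testing.
requests = [
    {
        "custom_id": f"req-{i}",
        "method": "POST",
        "url": "/v1/chat/completions",
        "body": {
            "model": "gpt-5.1",
            "messages": [{"role": "user", "content": f"Say the number {i}."}],
        },
    }
    for i in range(3)
]

with open("batch_input.jsonl", "w") as f:
    for req in requests:
        f.write(json.dumps(req) + "\n")

# Uploading the file and launching the batch needs an API key, so those
# standard SDK calls are only sketched here:
#
#   from openai import OpenAI
#   client = OpenAI()
#   file = client.files.create(file=open("batch_input.jsonl", "rb"), purpose="batch")
#   batch = client.batches.create(
#       input_file_id=file.id,
#       endpoint="/v1/chat/completions",
#       completion_window="24h",
#   )

def served_model(output_line: str) -> str:
    """Return the model name echoed in one line of the batch output file.

    The response body reports the model that actually served the call
    (possibly with a dated snapshot suffix), which is how I checked
    whether requests were being silently rerouted to gpt-5.
    """
    record = json.loads(output_line)
    return record["response"]["body"]["model"]
```

In my runs, every output line reported the model I requested rather than gpt-5.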


Thanks for reporting another capability flag that is not set correctly in the per-model documentation.

You can check the pricing page, where the populated data shows a batch price discount for these models, consistent with your requests succeeding:

It's a good place to visit before calling new models and methods anyway, to catch any "gotcha" pricing you might not expect.


Thank you @_j for helping @toastywaffles888. Please let us know if you need help with anything else. More than happy to help.


GPT-5.1 and GPT-5.2 have their model page documentation now updated to include batch.


However, please note and review GPT-5.2 Pro:

https://platform.openai.com/docs/models/gpt-5.2-pro

Like the other GPT-5.2 models before this update, it does not have the "batch" flag set on its model listing page. GPT-5 Pro, however, is a batch model, so we know the Pro tier can be batched in general.

Additionally, there are reported problems running it on the batch endpoint: the calls are not refused outright, but they exhibit symptoms visible to an API user.

Actions

  • Update the Pro model listing page to reflect the actual "batch" capability;
  • Investigate and replicate the reported symptom that "gpt-5.2 batch runs repeatedly at high expense"
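As a starting point for replicating the cost symptom, a batch can be watched and abandoned early if it starts failing heavily. A minimal sketch, assuming the `openai` Python SDK; the 25% failure threshold and the cancel-on-failure policy are my own illustrative choices, not documented behavior:

```python
def should_cancel(request_counts: dict, max_failed_ratio: float = 0.25) -> bool:
    """Decide whether a running batch looks unhealthy enough to cancel.

    `request_counts` mirrors the Batch API object's request_counts field:
    {"total": ..., "completed": ..., "failed": ...}. The 25% failure
    threshold is an arbitrary example, not an official recommendation.
    """
    total = request_counts.get("total", 0)
    failed = request_counts.get("failed", 0)
    if total == 0:
        return False  # nothing dispatched yet; too early to judge
    return failed / total > max_failed_ratio

# Polling and cancelling require an API key, so the loop is only sketched:
#
#   from openai import OpenAI
#   import time
#   client = OpenAI()
#   while True:
#       batch = client.batches.retrieve(batch_id)
#       if batch.status in ("completed", "failed", "expired", "cancelled"):
#           break
#       if should_cancel(batch.request_counts.model_dump()):
#           client.batches.cancel(batch_id)
#           break
#       time.sleep(60)
```

This at least caps the spend while the underlying behavior is investigated.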

Thanks for your attention.

Thank you for providing the update.