Both of these models are documented as supporting batching, but the Batch API rejects them when you actually submit a batch:
Line 1: The provided model 'o4-mini-2025-04-16' is not supported by the Batch API.
Line 1: The provided model 'o3-2025-04-16' is not supported by the Batch API.
Docs that say batching is supported:
https://platform.openai.com/docs/models/o4-mini
https://platform.openai.com/docs/models/o3
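For context, this is roughly the flow that produces the error above — a minimal sketch using the official Python SDK; the `requests.jsonl` file name and the `custom_id` value are placeholders rather than the exact request from the original report:

```python
from openai import OpenAI

client = OpenAI()

# requests.jsonl contains one request per line, e.g.:
# {"custom_id": "req-1", "method": "POST", "url": "/v1/chat/completions",
#  "body": {"model": "o4-mini-2025-04-16",
#           "messages": [{"role": "user", "content": "Hello"}]}}

# Upload the JSONL file for batch processing
batch_file = client.files.create(
    file=open("requests.jsonl", "rb"),
    purpose="batch",
)

# Create the batch job; this is where the "not supported by the Batch API"
# error appears in the batch's error file
batch = client.batches.create(
    input_file_id=batch_file.id,
    endpoint="/v1/chat/completions",
    completion_window="24h",
)

print(batch.status)
```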
andyw1
This is enabled now, sorry for the trouble!
Thanks, I can confirm it’s working for me!
This topic was automatically closed 2 days after the last reply. New replies are no longer allowed.