| Topic | Replies | Views | Date |
|---|---|---|---|
| Undocumented batch behavior: output_file_id not returned at creation, must query /v1/batches/{id} to detect errors | 4 | 78 | May 21, 2025 |
| Batches Missing In Console | 0 | 42 | May 17, 2025 |
| Large Discrepancy in Output Token Size Between Two Identical GPT-4o-mini Batch Runs | 5 | 70 | May 16, 2025 |
| GPT-4o reached enqueued token limit with a small batch job | 6 | 1348 | May 13, 2025 |
| Error with batch api [o3-mini] | 1 | 50 | May 11, 2025 |
| Enqueued token limit reached | 10 | 724 | May 9, 2025 |
| Batch API "in-progress" more than 12H for GPT-4.1-mini | 1 | 62 | May 9, 2025 |
| Batch API 403 Error no access to model | 6 | 134 | May 8, 2025 |
| Batch API can be used together with Scale Tier? | 9 | 86 | May 7, 2025 |
| Gpt-4o-mini batch api still bugged? | 8 | 171 | May 7, 2025 |
| TTS batch price reduction | 1 | 31 | May 5, 2025 |
| Batch API is returning error 400 | 1 | 62 | May 3, 2025 |
| Batch API Requests expiring without any progress at all | 8 | 554 | April 30, 2025 |
| Batch Usage and token limit with gpt-4o-mini | 0 | 61 | April 23, 2025 |
| Batch support for `o4-mini-2025-04-16` and `o3-2025-04-16`? | 3 | 175 | April 18, 2025 |
| How to setup a batch api? | 1 | 178 | April 10, 2025 |
| Prompt Caching in Batching API | 2 | 153 | April 6, 2025 |
| /v1/completions endpoint not supported by gpt-4o, gpt-4o-mini | 2 | 149 | April 5, 2025 |
| No batches in progress but get "Enqueued token limit reached" | 40 | 1163 | March 23, 2025 |
| Batch API request limit violates terms of service | 5 | 1208 | January 8, 2025 |
| Batch API jobs charging me regular API pricing | 4 | 118 | March 13, 2025 |
| "Batch" API is actually a serial API. Please add a queue! | 2 | 105 | March 5, 2025 |
| GPT4 Batch API returning nonsense | 0 | 42 | March 3, 2025 |
| Using Batch API for fine-tuned models | 1 | 134 | March 1, 2025 |
| How to prevent errors in GPT-4o batch processing? | 1 | 84 | February 27, 2025 |
| Can API keys be limited to only use Batch API? | 0 | 40 | February 26, 2025 |
| Error 500 (file_config is missing) when trying Batch API | 0 | 118 | February 23, 2025 |
| Reasoning_effort="high" how to add this in the batchapi for o3-mini model | 3 | 331 | February 19, 2025 |
| Embedding batches: randomly getting "Enqueued token limit reached for text-embedding-3-large" for rather small batches | 1 | 112 | February 18, 2025 |
| [BUG REPORT] batch error: input is larger than the 209,715,200 maximum | 3 | 165 | February 14, 2025 |