"Retrieve file content" API always fails with "502 Bad gateway"
|
|
6
|
174
|
August 14, 2025
|
Enqueued token limit reached
|
|
25
|
1320
|
August 12, 2025
|
Why does Batch API show more requests than expected?
|
|
6
|
112
|
July 31, 2025
|
Can we use file uploads (e.g. file IDs) in batch requests to /v1/responses?
|
|
1
|
56
|
July 24, 2025
|
Requests per limit rate limit exceeded in Batch API
|
|
2
|
101
|
July 9, 2025
|
Batch API is not supported for o3-pro, despite claiming to be supported on model card
|
|
6
|
287
|
July 1, 2025
|
Batch Usage and token limit with gpt-4o-mini
|
|
1
|
193
|
June 29, 2025
|
Batch API - System Prompt Caching - Is it possilbe to cache system prompt from single batch job and reuse it across multiple batches?
|
|
2
|
306
|
June 11, 2025
|
Inconsistent Number of Entries in JSONL Files from OpenAI Batch API
|
|
1
|
115
|
June 5, 2025
|
ENQUEUED_TOKENS not reset when the batches are completed
|
|
0
|
52
|
May 31, 2025
|
Don' see any option to request increase in api rate limit for batch API
|
|
3
|
150
|
May 27, 2025
|
Undocumented batch behavior: output_file_id not returned at creation, must query /v1/batches/{id} to detect errors
|
|
4
|
111
|
May 21, 2025
|
Batches Missing In Console
|
|
0
|
59
|
May 17, 2025
|
Large Discrepancy in Output Token Size Between Two Identical GPT-4o-mini Batch Runs
|
|
4
|
111
|
May 16, 2025
|
GPT-4o reached enqueued token limit with a small batch job
|
|
6
|
1893
|
May 13, 2025
|
Error with batch api [o3-mini]
|
|
1
|
100
|
May 11, 2025
|
Batch API "in-progress" more than 12H for GPT-4.1-mini
|
|
1
|
149
|
May 9, 2025
|
Batch API 403 Error no access to model
|
|
6
|
249
|
May 8, 2025
|
Batch API can be used together with Scale Tier?
|
|
9
|
147
|
May 7, 2025
|
Gpt-4o-mini batch api still bugged?
|
|
8
|
255
|
May 7, 2025
|
TTS batch price reduction
|
|
1
|
62
|
May 5, 2025
|
Batch API is returning error 400
|
|
1
|
101
|
May 3, 2025
|
Batch support for `o4-mini-2025-04-16` and `o3-2025-04-16`?
|
|
3
|
294
|
April 18, 2025
|
How to setup a batch api?
|
|
1
|
307
|
April 10, 2025
|
Prompt Caching in Batching API
|
|
2
|
333
|
April 6, 2025
|
/v1/completions endpoint not supported by gpt-4o, gpt-4o-mini
|
|
2
|
224
|
April 5, 2025
|
No batches in progress but get " Enqueued token limit reached"
|
|
40
|
1517
|
March 23, 2025
|
Batch API request limit violates terms of service
|
|
5
|
1736
|
January 8, 2025
|
Batch API jobs charging me regular API pricing
|
|
4
|
162
|
March 13, 2025
|
"Batch" API is actually a serial API. Please add a queue!
|
|
2
|
138
|
March 5, 2025
|