| Topic | Replies | Views | Activity |
| --- | --- | --- | --- |
| Large Discrepancy in Output Token Size Between Two Identical GPT-4o-mini Batch Runs | 2 | 35 | May 15, 2025 |
| Undocumented batch behavior: output_file_id not returned at creation, must query /v1/batches/{id} to detect errors | 3 | 50 | May 14, 2025 |
| GPT-4o reached enqueued token limit with a small batch job | 6 | 1283 | May 13, 2025 |
| Error with batch api [o3-mini] | 1 | 41 | May 11, 2025 |
| Enqueued token limit reached | 10 | 687 | May 9, 2025 |
| Batch API "in-progress" more than 12H for GPT-4.1-mini | 1 | 58 | May 9, 2025 |
| Batch API 403 Error no access to model | 6 | 91 | May 8, 2025 |
| Batch API can be used together with Scale Tier? | 9 | 78 | May 7, 2025 |
| Gpt-4o-mini batch api still bugged? | 8 | 166 | May 7, 2025 |
| TTS batch price reduction | 1 | 29 | May 5, 2025 |
| Batch API is returning error 400 | 1 | 56 | May 3, 2025 |
| Batch API Requests expiring without any progress at all | 8 | 541 | April 30, 2025 |
| Batch Usage and token limit with gpt-4o-mini | 0 | 57 | April 23, 2025 |
| Batch support for `o4-mini-2025-04-16` and `o3-2025-04-16`? | 3 | 153 | April 18, 2025 |
| How to setup a batch api? | 1 | 153 | April 10, 2025 |
| Prompt Caching in Batching API | 2 | 131 | April 6, 2025 |
| /v1/completions endpoint not supported by gpt-4o, gpt-4o-mini | 2 | 138 | April 5, 2025 |
| No batches in progress but get "Enqueued token limit reached" | 40 | 1129 | March 23, 2025 |
| Batch API request limit violates terms of service | 5 | 1146 | January 8, 2025 |
| Batch API jobs charging me regular API pricing | 4 | 113 | March 13, 2025 |
| "Batch" API is actually a serial API. Please add a queue! | 2 | 102 | March 5, 2025 |
| GPT4 Batch API returning nonsense | 0 | 41 | March 3, 2025 |
| Using Batch API for fine-tuned models | 1 | 127 | March 1, 2025 |
| How to prevent errors in GPT-4o batch processing? | 1 | 83 | February 27, 2025 |
| Can API keys be limited to only use Batch API? | 0 | 40 | February 26, 2025 |
| Error 500 (file_config is missing) when trying Batch API | 0 | 112 | February 23, 2025 |
| Reasoning_effort="high" how to add this in the batchapi for o3-mini model | 3 | 306 | February 19, 2025 |
| Embedding batches: randomly getting "Enqueued token limit reached for text-embedding-3-large" for rather small batches | 1 | 103 | February 18, 2025 |
| [BUG REPORT] batch error: input is larger than the 209,715,200 maximum | 3 | 160 | February 14, 2025 |
| Batch API for GPT 4o mini throwing a error 403 | 3 | 160 | February 16, 2025 |
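Several of these threads (e.g. "output_file_id not returned at creation, must query /v1/batches/{id} to detect errors" and the long "in-progress" reports) come down to the same pattern: a batch job's `output_file_id` and `error_file_id` only appear once the job reaches a terminal status, so you have to poll the batch resource. Below is a minimal, hedged sketch of such a polling loop. The helper takes a generic `fetch` callable so it can wrap, for example, `client.batches.retrieve(batch_id)` from the official `openai` Python package; the field names (`status`, `output_file_id`) follow the publicly documented Batch object, but treat the rest (timeouts, intervals) as illustrative choices, not a prescribed implementation.

```python
import time
from typing import Callable, Dict, Any

# Statuses after which a batch will not change again, per the Batch API docs.
TERMINAL_STATUSES = {"completed", "failed", "expired", "cancelled"}


def wait_for_batch(fetch: Callable[[], Dict[str, Any]],
                   interval: float = 30.0,
                   max_polls: int = 1000) -> Dict[str, Any]:
    """Poll the batch resource (via `fetch`) until it reaches a terminal status.

    `fetch` is any zero-argument callable returning the batch object as a
    dict -- e.g. `lambda: client.batches.retrieve(batch_id).to_dict()` with
    the official client (that wiring is an assumption, shown for context).
    Returns the final batch object; only then are `output_file_id` /
    `error_file_id` guaranteed to be populated (or absent on failure).
    """
    for _ in range(max_polls):
        batch = fetch()
        if batch.get("status") in TERMINAL_STATUSES:
            return batch
        time.sleep(interval)
    raise TimeoutError("batch did not reach a terminal status in time")
```

In practice you would follow this with a check on `batch["status"]`: download `output_file_id` on `"completed"`, and inspect `error_file_id` (plus `batch.get("errors")`) on `"failed"` or `"expired"`, since a batch can expire after 24 hours with partial or no progress, as some of the threads above describe.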