| Topic | Replies | Views | Activity |
|---|---|---|---|
| "Batch" API is actually a serial API. Please add a queue! | 2 | 218 | March 5, 2025 |
| GPT4 Batch API returning nonsense | 0 | 78 | March 3, 2025 |
| Using Batch API for fine-tuned models | 1 | 271 | March 1, 2025 |
| How to prevent errors in GPT-4o batch processing? | 1 | 180 | February 27, 2025 |
| Can API keys be limited to only use Batch API? | 0 | 113 | February 26, 2025 |
| Error 500 (file_config is missing) when trying Batch API | 0 | 362 | February 23, 2025 |
| Reasoning_effort="high" how to add this in the batchapi for o3-mini model | 3 | 772 | February 19, 2025 |
| Embedding batches: randomly getting "Enqueued token limit reached for text-embedding-3-large" for rather small batches | 1 | 209 | February 18, 2025 |
| [BUG REPORT] batch error: input is larger than the 209,715,200 maximum | 3 | 403 | February 14, 2025 |
| Batch API for GPT 4o mini throwing a error 403 | 3 | 294 | February 16, 2025 |
| Gpt-4o Batch Processing Jobs Response Time Increased Significantly, Causing Job Timeouts/failures | 2 | 623 | February 3, 2025 |
| Batch API Jobs Expiring Prematurely Using GPT-4o | 0 | 223 | February 3, 2025 |
| Batch API chat completion with Uploaded Files | 5 | 462 | January 31, 2025 |
| "You are not allowed to request logprobs" Error in Batch Processing with GPT-4o-mini | 4 | 817 | January 28, 2025 |
| Structured output on batch-api giving incomplete results | 3 | 368 | January 25, 2025 |
| Is better to have multiple small batch jobs or few large ones? | 1 | 375 | January 19, 2025 |
| Using Batch API Without uploading file | 1 | 408 | January 13, 2025 |
| Confused about OpenAI Batch API (GPT-4o-mini) pricing – Why are the total costs higher? | 7 | 2961 | October 18, 2024 |
| Newlines in batch mode prompts | 1 | 350 | January 4, 2025 |
| How can I run multiple batch jobs in parallel that are close to the queue limit? | 0 | 316 | January 2, 2025 |
| How to generate descriptions for 10k images with Batch API? | 0 | 199 | December 27, 2024 |
| Some batches creation FAILED even though they were within the batch queue limit | 0 | 388 | December 23, 2024 |
| Increase Batch API request limit beyond 1,000,000 requests queued | 2 | 376 | December 22, 2024 |
| Batch API Request: Maintain the context of the conversation | 0 | 87 | December 15, 2024 |
| Evaluating multiple PDFs documents using a batch process | 4 | 1267 | December 13, 2024 |
| Can Batch api work with prompt caching? | 4 | 1882 | December 6, 2024 |
| Batch API minimum completion window time | 0 | 222 | December 5, 2024 |
| Batches don't work at all | 18 | 2145 | November 28, 2024 |
| Batch processing gets different results than individual requests | 0 | 314 | November 27, 2024 |
| BATCH Api max_tokens and temperature default values | 0 | 386 | November 25, 2024 |