| Topic | Replies | Views | Activity |
| --- | --- | --- | --- |
| Need more completion_window options for batching, like 12h or multiple days | 1 | 138 | October 13, 2025 |
| Can anyone help with batch API? | 2 | 197 | April 10, 2025 |
| Batch file results encoding irregularities | 3 | 288 | March 21, 2025 |
| Batch API for GPT-4o mini throwing an error 403 | 3 | 290 | February 16, 2025 |
| Error on trying to use batches | 6 | 987 | November 11, 2024 |
| Batching in OpenAI API Python | 0 | 329 | September 17, 2024 |
| Need help with GPT-4 batching using LangChain wrapper | 1 | 1962 | August 1, 2024 |
| What to pick for larger community end users? Bulk, Batch or? | 2 | 102 | July 17, 2024 |
| Whisper has too high an error rate and hallucinates often | 0 | 480 | January 21, 2024 |
| GPT-4V preview limiting batch requests? | 5 | 1396 | December 31, 2023 |
| Batching with ChatCompletion Endpoint | 11 | 34452 | December 13, 2023 |
| Issues with Rate Limiting and Batch Processing in OpenAI API | 0 | 1995 | November 11, 2023 |
| GPT-3.5 fine-tune, submit-receive lists of data | 0 | 565 | October 22, 2023 |
| Embedding model token limit exceeding limit while using batch requests | 8 | 26197 | October 15, 2023 |
| How to approach batching calls using functions? | 3 | 3038 | December 14, 2023 |