Hi there,
I'm using the Batch API, but it has been quite slow since yesterday. One batch (about 2M tokens per batch) used to take 20 minutes, but now a single batch hasn't finished in 12 hours. Is this normal?
Same here. I even cancelled a batch to see if the system was responding. It is. The batch got cancelled.
Maybe something is wrong with the Batch API right now, I guess.
The Batch API's service level is fulfillment within 24 hours.
You are submitting a job to be completed during off-peak hours, on otherwise idle compute.
It is that efficient use of resources, scheduled by OpenAI, that gets you the half price. When some jobs aren't held back, that's a perk, not a guarantee; reduced timeliness is simply the trade-off.
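For context, the 24-hour window is chosen explicitly when the batch is created. Here's a minimal sketch of how a batch request file is built and submitted; the `build_request_line` helper is my own illustration (not part of the SDK), and the submission calls are commented out since they need an API key:

```python
import json

def build_request_line(custom_id: str, model: str, prompt: str) -> str:
    """Build one JSONL line for a /v1/chat/completions batch request."""
    return json.dumps({
        "custom_id": custom_id,
        "method": "POST",
        "url": "/v1/chat/completions",
        "body": {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        },
    })

line = build_request_line("req-1", "gpt-4o-mini", "Say hello")
print(line)

# Submission sketch (requires an API key; assumes the official openai SDK):
# from openai import OpenAI
# client = OpenAI()
# batch_file = client.files.create(file=open("requests.jsonl", "rb"),
#                                  purpose="batch")
# batch = client.batches.create(
#     input_file_id=batch_file.id,
#     endpoint="/v1/chat/completions",
#     completion_window="24h",  # the only documented window
# )
```

The point is that "24h" is the contract you opt into; anything faster is best-effort.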
Yes, a 24-hour completion window is expected.
However, I've been using batches for months, and the performance has generally been very good. Occasionally there were slight delays of several minutes before requests started to be processed, but what has been happening since yesterday is unacceptable. I waited 11 hours and 18 minutes for a test batch of 2 requests to complete. Another batch finished in 10 hours. There has been no improvement since.
Clearly, something has changed, making batches unusable for many use cases.
My batch status has been "in_progress 0/N" for more than 12 hours! Any ideas how to solve it?
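One workaround while batches are stuck: poll the batch yourself and cancel it after your own deadline instead of waiting for the 24-hour expiry, then fall back to synchronous calls. The decision logic below is plain Python so it can be tested offline; the actual poll loop using the SDK is sketched in comments (a sketch, assuming the official `openai` client):

```python
def next_action(status: str, started_at: float, now: float,
                deadline_s: float = 6 * 3600) -> str:
    """Decide what to do with a batch given its status and elapsed time."""
    if status in ("completed", "failed", "expired", "cancelled", "cancelling"):
        return "done"
    if now - started_at > deadline_s:
        return "cancel"  # give up early instead of waiting for the 24h expiry
    return "wait"

# Poll-loop sketch (needs an API key and a real batch_id):
# import time
# from openai import OpenAI
# client = OpenAI()
# started = time.time()
# while True:
#     batch = client.batches.retrieve(batch_id)
#     action = next_action(batch.status, started, time.time())
#     if action == "cancel":
#         client.batches.cancel(batch_id)
#     if action != "wait":
#         break
#     time.sleep(60)
```

The 6-hour deadline is arbitrary; pick whatever your use case can tolerate.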
Hey, I've been using the Batch API for a few months and batches have been processed in less than five minutes. Today a regular batch is taking 5 hours. Horrible for development.
Did it just fix itself on subsequent days for you?
The Batch API hasn't been working for me again for the past 3 days. Batch requests are expiring after 24 hours with 0 completed. It's ridiculous that OpenAI isn't mentioning this outage on status.openai.com.
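When a batch expires partway through, you are still charged only for what completed, and the partial results land in the output file. A small helper like the one below (my own illustration, not an SDK function) can diff the `custom_id`s you submitted against the output JSONL to find what needs resubmitting in a fresh batch:

```python
import json

def ids_to_resubmit(submitted_ids, output_jsonl_text):
    """Return the custom_ids that have no result line in the batch
    output file and therefore need to go into a new batch."""
    done = set()
    for line in output_jsonl_text.splitlines():
        if line.strip():
            done.add(json.loads(line)["custom_id"])
    return [cid for cid in submitted_ids if cid not in done]

# Example with a one-line output file: only "a" completed.
output = '{"custom_id": "a", "response": {}}\n'
print(ids_to_resubmit(["a", "b", "c"], output))  # ['b', 'c']
```

In a real script you would fetch the output file with `client.files.content(batch.output_file_id)` before calling this, assuming the batch produced any output at all.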
Same here, very slow. And getting cancelled.
Me too, the Batch API stopped working; stuck at in_progress for the whole day.
I have had a similar problem this week.
Even a batch with 1 or 13 requests could not finish and expired.
I am running the "gpt-4o-mini" model.