Enqueued Token Limit Reached Even Though No Batches Are Processing

I was using the GPT-4o mini batch API (Tier 1), and it was working fine until yesterday. Suddenly, regardless of the file I upload, I get the error: "Enqueued token limit reached for gpt-4o-mini." My batch contains 3,000 requests, the total token count is well under the 2 million enqueued-token limit, and the file is only around 15 MB.
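To double-check my own math, I estimated the enqueued tokens from the batch JSONL before uploading. This is just a rough stdlib-only sketch: the ~4-characters-per-token heuristic is an approximation (a real tokenizer like `tiktoken` would be exact), and I'm assuming each request's `max_tokens` is reserved against the enqueued limit as well:

```python
import json
import io

def estimate_enqueued_tokens(jsonl_text: str, chars_per_token: int = 4) -> int:
    """Rough estimate of the tokens a batch JSONL file will enqueue.

    Heuristic only: counts message characters / chars_per_token, and
    assumes each request's max_tokens also counts against the limit.
    """
    total = 0
    for line in io.StringIO(jsonl_text):
        line = line.strip()
        if not line:
            continue
        request = json.loads(line)
        body = request.get("body", {})
        # Approximate input tokens from the message contents
        for message in body.get("messages", []):
            total += len(str(message.get("content", ""))) // chars_per_token
        # Assumed: reserved output space counts toward the enqueued limit
        total += body.get("max_tokens", 0)
    return total

# One tiny request in OpenAI batch JSONL format
sample = (
    '{"custom_id": "1", "method": "POST", "url": "/v1/chat/completions",'
    ' "body": {"model": "gpt-4o-mini", "max_tokens": 100,'
    ' "messages": [{"role": "user", "content": "Hello there"}]}}\n'
)
print(estimate_enqueued_tokens(sample))  # 102: 11 chars // 4 = 2, plus max_tokens
```

Even with this pessimistic estimate, my file comes out far below 2 million tokens, so I don't see why the limit error fires.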
