I have the exact same problem. I have enough balance, and I’m the sole user of my account with a single API key.
I was uploading small .jsonl files one by one to gpt-4o, each around 50k input tokens and 8.5k output tokens, well below the 90k enqueued-token threshold for tier 1. Yet, even though no batches are running, I keep getting:
"Enqueued token limit reached for gpt-4o in organization ###. Limit: 90,000 enqueued tokens. Please try again once some in_progress batches have been completed."
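For what it’s worth, listing my batches through the API also shows nothing pending. Here’s a quick sanity check using the official openai Python package (the fields printed are from its Batch object):

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Nothing here is validating or in_progress, yet the
# enqueued-token error still fires on the next upload.
for batch in client.batches.list(limit=20):
    print(batch.id, batch.status, batch.request_counts)
```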
I get the same error even when I upload the files one by one by hand through the web UI; the files fail instantly on upload. Then, after an hour, I’m able to upload the same file that previously failed with no issues, but I have to wait another hour before I can upload the next one.
For reference, I have max_tokens set to 100 per prompt in the file (outputs are always 84 tokens, since I’m returning integer values as JSON).
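For context, each request line in my .jsonl is built roughly like this (the custom_id, prompt content, and file name are placeholders; the request shape is the standard Batch API format):

```python
import json

# Roughly one request line in my batch .jsonl.
request = {
    "custom_id": "row-0001",
    "method": "POST",
    "url": "/v1/chat/completions",
    "body": {
        "model": "gpt-4o",
        "messages": [{"role": "user", "content": "...prompt..."}],
        "max_tokens": 100,  # hard cap; actual outputs are ~84 tokens
    },
}

with open("batch_input.jsonl", "a") as f:
    f.write(json.dumps(request) + "\n")
```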
The problem is, I have about 100 files, and I don’t have the patience to wait out this bug…
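In the meantime, the best I can do is script the waiting. A rough sketch (assuming the openai Python package; the directory name and back-off times are made up) that submits one file at a time and only moves on once the previous batch clears:

```python
import glob
import time

from openai import OpenAI

client = OpenAI()
TERMINAL = {"completed", "failed", "expired", "cancelled"}

def run_one(path: str) -> None:
    """Upload one .jsonl file, create a batch, and block until it
    finishes, resubmitting if it dies (e.g. on the enqueued-token limit)."""
    uploaded = client.files.create(file=open(path, "rb"), purpose="batch")
    while True:
        batch = client.batches.create(
            input_file_id=uploaded.id,
            endpoint="/v1/chat/completions",
            completion_window="24h",
        )
        while batch.status not in TERMINAL:
            time.sleep(60)
            batch = client.batches.retrieve(batch.id)
        if batch.status == "completed":
            return
        time.sleep(600)  # back off, then resubmit the failed batch

for path in sorted(glob.glob("inputs/*.jsonl")):
    run_one(path)
```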