It has nothing to do with tier. I am Tier 1 and I keep getting this error:

```
Enqueued token limit reached for o3 in organization org-wMbOVqlCL6JLIVbM5oqQa7uj. Limit: 90,000 enqueued tokens. Please try again once some in_progress batches have been completed.
```
I started with a medium-size input, which I had to divide into smaller and smaller inputs. I have no batch queued or running, yet now I cannot even run a batch of 4 requests. When I retry earlier batches that used to work, they fail too. There must definitely be a bug, or the error message is wrong and not pointing at the correct limit issue.
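For anyone hitting the same error: one way to double-check that nothing is still counting against the enqueued-token limit is to list your batches through the API and look for any that have not finished yet. A minimal sketch with the official `openai` Python SDK; which states actually hold tokens against the limit is my assumption, not something documented in this thread:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Assumption: batches in these states may still count against the
# enqueued-token limit until they finish or are cancelled.
PENDING_STATES = {"validating", "in_progress", "finalizing", "cancelling"}

pending = [b for b in client.batches.list(limit=100) if b.status in PENDING_STATES]

if not pending:
    print("No pending batches found for this API key.")
for b in pending:
    print(b.id, b.status, b.request_counts)
```

If that prints nothing and the error persists, it does look like either a stuck batch on the backend or a misleading error message.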
Hi! Thanks for the effort, @OpenAI_Support. I don’t think it is solved, however. Take a look at, for example, batch_6899d526123481908e1422aacf0fba3e and batch_6899b1e29e508190a5d1d42e39ab66a6.
Edit: I reached out via help.openai.com, but was only able to talk to a chatbot… Will I get any feedback on the issue at all?
Edit 2: I was counting the tokens wrongly; the issue is fixed now! Make sure you're using the right version of tiktoken, guys!
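For others who land here: newer models use the `o200k_base` encoding, and older tiktoken releases don't know about it, so token counts come out wrong. Below is a minimal sketch of how one might pre-count the input tokens of a batch file before submitting; the file name, per-request handling, and the assumption that the enqueued limit is dominated by input tokens are mine, not anything official:

```python
import json
import tiktoken

# Assumption: the target model uses the o200k_base encoding (as gpt-4o does).
# Older tiktoken versions don't ship this encoding, so pin a recent release.
enc = tiktoken.get_encoding("o200k_base")

total = 0
with open("batch_input.jsonl") as f:  # hypothetical batch input file
    for line in f:
        body = json.loads(line)["body"]
        # Assumes plain-string message contents (no image/multi-part content).
        for msg in body.get("messages", []):
            total += len(enc.encode(msg.get("content", "")))

# Message framing and reserved output tokens are not counted here,
# so treat this as a rough lower bound on what gets enqueued.
print(f"Approximate input tokens to be enqueued: {total}")
```

Keeping that number comfortably under the 90,000 enqueued-token limit (across all pending batches) avoided the error for me.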
@j.n.fonseca This appears to be a different issue. In your case, you're exceeding the Tier 1 token limit. I recommend reaching out to our support team at support@openai.com so they can assist you further.