Enqueued token limit reached

It has nothing to do with tier. I am on tier 1 and I keep getting this error:

```
Enqueued token limit reached for o3 in organization org-wMbOVqlCL6JLIVbM5oqQa7uj. Limit: 90,000 enqueued tokens. Please try again once some in_progress batches have been completed.
```

I started with a medium-sized input, which I had to divide into smaller and smaller inputs. I have no batches queued or running, yet now I cannot even run a batch of 4 requests. Earlier batches that used to work are no longer working. There must definitely be a bug, or the error message is wrong and not indicating the correct limit issue.

PLEASE FIX


batch_688f9cacecd4819099e8bd9ccfbee9b9

batch_688f9d034a2481908377f94b17f0edb9

Hello! We've fixed this issue. If you are still seeing it, please reach out to us via help.openai.com with your batch IDs.


Hi! Thanks for the effort, @OpenAI_Support. I don’t think it is solved, however. Take a look at, for example, batch_6899d526123481908e1422aacf0fba3e and batch_6899b1e29e508190a5d1d42e39ab66a6.

Edit: I reached out via help.openai.com, but was only able to talk to a chatbot… Will I get any feedback on the issue at all?

Edit 2: I was counting the tokens wrongly; the issue is fixed now! Make sure you’re using the right version of tiktoken, guys!

@j.n.fonseca This appears to be a different issue. In your case, you’re exceeding the token limit for tier-1. I recommend reaching out to our support team at support@openai.com so they can assist you further.


You are correct, I ended up solving the issue. 🙂 Thanks!

This topic was automatically closed 5 days after the last reply. New replies are no longer allowed.