Problem creating a GPT-4o Vision batch (enqueued token limit)

Hi,

I’m trying to create a batch job using GPT-4o with vision inputs. However, even when the batch only has a few lines, I get the following error:

“Enqueued token limit reached for gpt-4o in organization org-… . Limit: 90,000 enqueued tokens. Please try again once some in-progress batches have been completed.”

There are no batches in progress, and every batch size I’ve tried has failed.

If someone knows how to fix this, I would appreciate it.

Thanks in advance!

Welcome to the Forum!

If this batch queue limit is too low for you, you’ll need to consider moving to a higher usage tier. It currently appears that you are in Tier 1. Moving to a higher tier, which comes with a higher limit, requires adding more funds to your account. You can find an overview of the eligibility criteria for the different tiers at the link below.

Link: https://platform.openai.com/docs/guides/rate-limits/usage-tiers

Thanks for the quick answer.

I understand that my limits are not that high, but I’m nowhere near exhausting them. I even tried a batch that only needs a few tokens and still received the same error.

Am I misunderstanding something?

How large are your requests? Roughly how many input and output tokens does each request use?

Or let me ask differently: have you tried running one of your requests as a single regular API call to get a better breakdown of your token usage? That would help identify the potential root cause.
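For example, something along these lines (a minimal sketch using the official Python SDK; the image URL, prompt, and max_tokens value are placeholders you’d swap for one of your own batch lines):

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Send one of your batch requests as a regular (non-batch) chat completion
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Describe this image in one sentence."},
                # placeholder URL; use the same image you put in the batch line
                {"type": "image_url", "image_url": {"url": "https://example.com/sample.jpg"}},
            ],
        }
    ],
    max_tokens=300,
)

# The usage object shows how many tokens a single request actually consumes
print(response.usage)  # prompt_tokens, completion_tokens, total_tokens
```

Image inputs can add a surprising number of prompt tokens, so a single vision request may be much larger than the text alone suggests.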


Thanks again.

I was able to create a batch small enough to run. The only thing I don’t understand is that a month ago I ran a batch with GPT-4 Turbo that exceeded this token limit just fine. Is this limit new?

No, the limits are not new. The same 90,000-token limit applies to gpt-4-turbo under Tier 1, so I’m not sure why that worked :thinking:
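If you want to sanity-check a batch file before submitting it, you can roughly estimate its text tokens. This is only a sketch: it assumes a `requests.jsonl` batch input file and the `o200k_base` encoding, it ignores image tokens entirely, and counting `max_tokens` per request toward the queue is an assumption about how enqueued tokens are accounted, not something documented in this thread.

```python
import json
import tiktoken

enc = tiktoken.get_encoding("o200k_base")  # tokenizer used by the gpt-4o family

total = 0
with open("requests.jsonl") as f:  # placeholder name for your batch input file
    for line in f:
        body = json.loads(line)["body"]
        for message in body.get("messages", []):
            content = message.get("content", "")
            if isinstance(content, str):
                total += len(enc.encode(content))
            else:
                # multimodal content: only text parts are counted; image tokens are ignored
                total += sum(
                    len(enc.encode(part["text"]))
                    for part in content
                    if part.get("type") == "text"
                )
        # assumption: the queue may also reserve up to max_tokens of output per request
        total += body.get("max_tokens", 0)

print(f"Rough lower-bound estimate of enqueued tokens: {total}")
```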

@hackisack, out of curiosity, do you see this error message when you call batches.create() or when you call batches.retrieve()?
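For reference, this is the flow I mean (a minimal sketch with the official Python SDK; `requests.jsonl` is a placeholder file name):

```python
from openai import OpenAI

client = OpenAI()

# 1. Upload the JSONL file containing the batch requests
batch_file = client.files.create(file=open("requests.jsonl", "rb"), purpose="batch")

# 2. Create the batch job; the enqueued-token limit is checked against this submission
batch = client.batches.create(
    input_file_id=batch_file.id,
    endpoint="/v1/chat/completions",
    completion_window="24h",
)
print(batch.id, batch.status)

# 3. Check the status later; a batch that trips the limit typically shows up as failed,
#    with the error details visible here as well as in the web UI
batch = client.batches.retrieve(batch.id)
print(batch.status, batch.errors)
```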

It was after creation, and the same error also shows up in the web UI.