Is it better to have multiple small batch jobs or a few large ones?

What’s faster?

  1. one batch job that contains 2M tokens
  2. two batch jobs that contain 1M tokens each

Example using Tier 1 gpt-4o-mini, which has a 2,000,000-token batch queue limit

From what I have seen, it's best to split each specific task into its own smaller batch job. Take turnaround time into account as well, though: many small jobs mean more API calls and more overhead, so it depends on the use case. But most of the time, in my experience, smaller batch jobs perform better :slight_smile:
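
A minimal sketch of that split-into-smaller-jobs approach with the OpenAI Python SDK, for anyone who wants a starting point. The request bodies, the chunk size of 5,000, and the file names are all hypothetical placeholders; tune the chunk size so each job's token count stays under your tier's batch queue limit.

```python
import json
from openai import OpenAI

client = OpenAI()

# Hypothetical example requests; replace with your own tasks/prompts.
requests = [
    {
        "custom_id": f"task-{i}",
        "method": "POST",
        "url": "/v1/chat/completions",
        "body": {
            "model": "gpt-4o-mini",
            "messages": [{"role": "user", "content": f"Summarize item {i}"}],
        },
    }
    for i in range(10_000)
]

def chunk(items, size):
    """Yield successive chunks of at most `size` items."""
    for start in range(0, len(items), size):
        yield items[start:start + size]

batch_ids = []
# Assumed chunk size of 5,000 requests per job; pick a size that keeps
# each job's total tokens comfortably below the batch queue limit.
for n, group in enumerate(chunk(requests, 5_000)):
    path = f"batch_{n}.jsonl"
    with open(path, "w") as f:
        for req in group:
            f.write(json.dumps(req) + "\n")
    batch_file = client.files.create(file=open(path, "rb"), purpose="batch")
    batch = client.batches.create(
        input_file_id=batch_file.id,
        endpoint="/v1/chat/completions",
        completion_window="24h",
    )
    batch_ids.append(batch.id)

print(batch_ids)
```

Smaller per-task jobs like this also make partial failures cheaper: if one job errors out, you only resubmit that chunk instead of the whole workload.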