Hello.
I was trying to generate embeddings with the Batch API (text-embedding-3-small).
After about 20 batches had completed, the next input file I submitted went to a failed status and could not be processed.
The error message said that the enqueued tokens had reached the 3,000,000 limit, even though all the previous batches should already have finished.
(The input texts were less than 3,000,000 tokens.)
A day or two later, I submitted the failed file again, and this time it processed successfully.
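For context, my submission flow looks roughly like this (a simplified sketch using the official openai Python client; the file path and the 60-second poll interval are placeholders): each JSONL input file is uploaded, a batch is created, and the batch is polled until it reaches a terminal state before the next file is submitted.

```python
import time
from openai import OpenAI

client = OpenAI()

def submit_and_wait(path: str) -> str:
    # Upload the JSONL input file for the Batch API.
    batch_file = client.files.create(file=open(path, "rb"), purpose="batch")

    # Create the embedding batch (the file contains /v1/embeddings requests
    # for text-embedding-3-small).
    batch = client.batches.create(
        input_file_id=batch_file.id,
        endpoint="/v1/embeddings",
        completion_window="24h",
    )

    # Poll until the batch reaches a terminal state before submitting the next file.
    while batch.status not in ("completed", "failed", "expired", "cancelled"):
        time.sleep(60)
        batch = client.batches.retrieve(batch.id)

    return batch.status
```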
Are there any other limits besides the 3,000,000 enqueued token limit?
Is the following page relevant?
https://platform.openai.com/docs/guides/rate-limits#error-mitigation
Thanks.