Questions about File Upload Limits with the OpenAI File API

Hi, I have a question regarding OpenAI file uploads.

According to the File uploads FAQ, it seems that the number of files you can upload depends on your ChatGPT plan.

However, when I use the OpenAI File API, I don’t seem to run into this limitation.

  • For example, I uploaded around 100 files through the API and connected them to a Vector Store without hitting any restrictions.

From what I understand, the only clear limits are:

  • Maximum file size per upload: 512MB

  • Maximum total storage per organization: 100GB

So my questions are:

  1. Is there no specific limit on the number of files that can be uploaded via the File API?

  2. Is there a defined limit on the number of files that can be attached to a single Vector Store?

If anyone has official guidance or experience on these constraints, I’d really appreciate your input.

“ChatGPT” is the subscription service you pay for and use through a web page; its file-upload limits come from your ChatGPT plan.

The API is pay-per-use: you pay for the calls you make, plus a daily storage fee for the data in files you attach to a vector store.

The files endpoint itself is free, and you can now run storage up to 1 TB (according to a recent change in the Python SDK). Asking an AI about those files is where they get you: input-token billing and file_search tool fees.

The challenge is in processing a large number of independent files for vector store attachment. You’re better off doing it yourself, one file per call, than batching a list of them and getting back many reported file failures.
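A minimal sketch of the one-file-per-call approach. The upload step is passed in as a function so failures can be tracked per file; the commented-out wiring shows roughly how it would connect to the `openai` Python SDK (`client.files.create`, `client.vector_stores.files.create`), but check those method names against your SDK version:

```python
def attach_files_one_by_one(paths, upload_fn):
    """Upload each file individually so one failure doesn't poison
    a whole batch; returns (succeeded, failed) lists of paths."""
    succeeded, failed = [], []
    for path in paths:
        try:
            upload_fn(path)          # uploads the file and attaches it to the store
            succeeded.append(path)
        except Exception:
            failed.append(path)      # retry these later, individually
    return succeeded, failed

# Hypothetical wiring with the real SDK (VS_ID is your vector store id):
# from openai import OpenAI
# client = OpenAI()
# def upload_fn(path):
#     f = client.files.create(file=open(path, "rb"), purpose="assistants")
#     client.vector_stores.files.create(vector_store_id=VS_ID, file_id=f.id)
```

This way a single corrupt or unsupported file shows up in `failed` instead of being buried in a batch-level error report.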

Each vector_store can hold up to 10,000 files.

As for the Files API, I have no further references. Better open a support ticket for that.


The limit for this is 500 files per batch.
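Given that cap, a larger file list has to be split before calling the batch endpoint. A small sketch of the chunking (the `file_batches.create` call in the comment is how the current Python SDK exposes it, but verify against your version):

```python
def chunk(ids, size=500):
    """Split a list of file IDs into batches no larger than `size`."""
    return [ids[i:i + size] for i in range(0, len(ids), size)]

# Hypothetical usage with the SDK (VS_ID is your vector store id):
# for batch in chunk(all_file_ids):
#     client.vector_stores.file_batches.create(
#         vector_store_id=VS_ID, file_ids=batch
#     )
```

For example, 1,200 file IDs would yield batches of 500, 500, and 200.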
