What happens when you hit the Assistants API 100GB file limit per organization?

I have two questions on the Assistant 100GB file limit:

  1. What does the 100GB limit per organization mean? Is it per region endpoint? Per API key? Or per assistant?

  2. What happens when we hit the Assistants API 100GB limit per organization? Can I continue uploading files, or will old files be deleted automatically?

Appreciate your help, thanks


“Per organization”. A login account generally has one organization: your company's business ID, associated with a payment method and credits.

The organization is the separation between your data and my API data. It is the common container for all projects, keys, team members, assistants, etc.

If you were to attempt to upload gigabyte 101 to the files endpoint, you would get an API error that reports the reason. You would need to delete some existing storage files. I’ve never hit it to see the mechanism at work.

It is rather annoying that there is no dashboard showing your current storage usage and file count, since the backend must already have that number available for any upload limiting to work.
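In the absence of a dashboard, you can compute the usage yourself. A minimal sketch, assuming the official `openai` Python package and its `files.list()` method (which returns file records with a `bytes` field); the helper itself is pure, and the network call only runs when executed as a script:

```python
def total_usage_bytes(files):
    """Sum the `bytes` field over a list of file records (dicts or objects)."""
    return sum(f["bytes"] if isinstance(f, dict) else f.bytes for f in files)

if __name__ == "__main__":
    from openai import OpenAI  # pip install openai
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    files = list(client.files.list())
    used = total_usage_bytes(files)
    print(f"{len(files)} files, {used / 1e9:.2f} GB of the 100 GB limit")
```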

Please contact us if you need to increase these storage limits.


Thank you for replying.

I have another question:

If I keep uploading files with different assistants (say I have 1000 assistants sharing the same organization), will all those assistants share the same 100GB? And does this 100GB count files the assistants generate (for example, generated images)?


Yes, outputs from code interpreter, such as data files or diagram image files it creates, also go into the same single storage pool.

While inactive threads are deleted after 60 days, inactive uploaded and output files are not.

Therefore, capturing file_ids as they are produced, storing them in a database tied to the user and session, and applying an expiration delete policy when they are no longer useful is the maintenance you must do to avoid an ever-growing number of orphan files.
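The expiration policy above can be sketched roughly as follows. This assumes you already capture `(file_id, created_at)` records in your own database; the 30-day window is an example value, not something from the thread, and the deletion loop (using the official `openai` package's `files.delete`) only runs as a script:

```python
import time

EXPIRY_SECONDS = 60 * 60 * 24 * 30  # assumed 30-day retention window

def expired_file_ids(records, now=None):
    """Return file_ids whose `created_at` (unix seconds) is older than the window."""
    now = time.time() if now is None else now
    return [r["file_id"] for r in records if now - r["created_at"] > EXPIRY_SECONDS]

if __name__ == "__main__":
    from openai import OpenAI  # pip install openai
    client = OpenAI()
    # In practice, records come from your own database of captured file_ids.
    records = [{"file_id": f.id, "created_at": f.created_at}
               for f in client.files.list()]
    for fid in expired_file_ids(records):
        client.files.delete(fid)
```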

The only storage separate from “files” is the vector stores used with file search. While they are built from uploaded files in storage, the extracted text and vectors live in their own storage allotment, billed per GB per day.


Thank you so much for the explanation.

Another issue we are facing: we have already uploaded a lot of files, and we didn't keep track of those file_ids or the generated file_ids. All those files still exist in our organization's storage because we have no delete policy. Is there any way to delete all the files in our current OpenAI storage?


You can only delete one file at a time by API, by its ID.

Here’s a utility for deleting them all. It gets a list of all the files for one “files” purpose and deletes them one by one, at about one per second.

Use with caution, as it WILL delete everything.
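A reconstruction sketch of such a utility, assuming the official `openai` Python package (`files.list` accepts a `purpose` filter; `purpose="assistants"` is an assumption, adjust to your case). The one-second pause keeps the loop well under rate limits:

```python
import time

def delete_all(client, purpose="assistants", pause=1.0):
    """Delete every file with the given purpose. IRREVERSIBLE: use with caution."""
    deleted = []
    for f in client.files.list(purpose=purpose):
        client.files.delete(f.id)  # permanently removes the file
        deleted.append(f.id)
        time.sleep(pause)          # ~one deletion per second
    return deleted

if __name__ == "__main__":
    from openai import OpenAI  # pip install openai
    ids = delete_all(OpenAI())  # WARNING: deletes every matching file
    print(f"Deleted {len(ids)} files")
```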
