Avoiding “Vector Store Size Limit Reached” in Long-Running Threads

I couldn’t find any topic on how to manage constantly growing storage, so I’m creating my own:
We run a technical support service with an integrated bot powered by OpenAI Assistants. Our clients use our platform to provide technical support to their own users. The idea is that each client uses their own API key and their own preconfigured Assistant (referenced by Assistant ID) with the bot. The bot processes the conversation between the end user seeking support and the support agent. Messages may include image content in addition to text, as well as file attachments.

The bot performs well: the logic is implemented such that the same Thread is used for a single support request, and as new messages arrive, they are added to the existing Thread, preserving the full context of the conversation.
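For context, the reuse flow looks roughly like this. This is a minimal sketch with the OpenAI Python SDK, not our production code; the helper name `append_and_run` and the attachment handling are my own illustration, and `client` is assumed to be an `openai.OpenAI()` instance created with the client’s API key:

```python
def append_and_run(client, thread_id: str, assistant_id: str,
                   text: str, file_ids: list[str]):
    """Append the incoming message to the existing support Thread and
    run the client's preconfigured Assistant on it.

    `thread_id` is the Thread we keep for the whole support request,
    so the full conversation context is preserved.
    """
    client.beta.threads.messages.create(
        thread_id=thread_id,
        role="user",
        content=text,
        # File attachments go through file_search.
        attachments=[{"file_id": f, "tools": [{"type": "file_search"}]}
                     for f in file_ids],
    )
    # Block until the Run finishes and return it.
    return client.beta.threads.runs.create_and_poll(
        thread_id=thread_id, assistant_id=assistant_id,
    )
```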

During testing, we discovered that storage limits are reached rather quickly, and we encountered the error:
“Vector store size limit reached.”
To keep our system functioning as intended, we need to keep storage clean by regularly deleting unnecessary Files and Vector stores. Attempts to find best practices or strategies for cleaning storage—on this forum or elsewhere—have yielded no useful results, so I decided to experiment with deleting Files and Vector stores used in the current support Thread myself.

Immediately after adding messages (with Files) to a Thread, launching a Run, and receiving the Assistant’s response, I delete every Vector store listed in thread.tool_resources.file_search.vector_store_ids, along with the Files those stores contain. The Files and Vector stores are indeed deleted, but the Thread now contains references to Files and Vector stores that no longer exist, even though this Thread is still supposed to be used for processing future incoming messages.
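The cleanup step I tried can be sketched like this (the helper name is mine; I’m assuming a recent SDK version where vector stores live under `client.vector_stores` rather than `client.beta.vector_stores`):

```python
def delete_thread_search_resources(client, thread_id: str) -> list[str]:
    """Delete every Vector store attached to the Thread for file_search,
    together with the Files inside it. Returns the deleted IDs.

    Note: on older SDK versions the vector store methods were under
    client.beta.vector_stores instead of client.vector_stores.
    """
    deleted = []
    thread = client.beta.threads.retrieve(thread_id)
    for vs_id in (thread.tool_resources.file_search.vector_store_ids or []):
        # Remove the underlying File objects first...
        for vs_file in client.vector_stores.files.list(vector_store_id=vs_id):
            client.files.delete(vs_file.id)
            deleted.append(vs_file.id)
        # ...then the store itself.
        client.vector_stores.delete(vs_id)
        deleted.append(vs_id)
    return deleted
```

As described below, doing this eagerly is exactly what leaves the Thread pointing at resources that no longer exist.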

I hoped that this wouldn’t be an issue and that the Assistant would simply ignore missing Files—but that’s not the case. When trying to process the same Thread again, I get an error like this:

Error while downloading https://prodfsuploads09.blob.core.windows.net/files/{file_id}?se=2025-05-04T18%3A02%3A13Z&sp=r&sv=2024-08-04&sr=b&rscc=max-age%3D3599%2C%20immutable%2C%20private&rscd=attachment%3B%20filename%3D111.png&sig=7DONF9D3ZsDL8/0kqWsHSXiDDxNPhqI8RMXQf/vQTtE%3D.

While writing this post, I realized that this error only seems to happen with images (see filename=111.png). Images are added to the Thread only as image_file content, not as attachments.
Also, after deleting an image file that was used as message content in a Thread, a subsequent Run may fail with this error:
“Failed to fetch image file: {file_id}. Check if the file has been deleted.”
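One way to avoid that particular failure would be to scan the Thread’s messages for image_file content and exclude those file IDs from any cleanup. A sketch (the helper is hypothetical, not something we have in production yet):

```python
def image_file_ids_in_thread(client, thread_id: str) -> set[str]:
    """Collect the file IDs that appear as image_file content in the
    Thread's messages. Deleting these Files breaks later Runs, so they
    should be kept until the support request is closed.
    """
    keep = set()
    for message in client.beta.threads.messages.list(thread_id=thread_id):
        for part in message.content:
            if part.type == "image_file":
                keep.add(part.image_file.file_id)
    return keep
```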

I see that, in theory, it might be possible to automate the deletion of unnecessary Files from storage: the File object has an expires_at property, but there seems to be no way to set it during file upload. So how can it be set?

Additionally, I noticed that when creating or modifying a Vector store, you can set the expires_after parameter—but there’s only one supported anchor: last_active_at. I’m wondering whether it’s possible to use the Vector store’s creation time as an anchor?
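For reference, this is the only expiry policy I can see how to set today (a sketch; the helper name and the days value are arbitrary):

```python
def create_expiring_store(client, name: str, idle_days: int = 7):
    """Create a Vector store that expires `idle_days` days after it was
    last used. `last_active_at` is the only documented anchor, so expiry
    is relative to last activity, not to creation time.
    """
    return client.vector_stores.create(
        name=name,
        expires_after={"anchor": "last_active_at", "days": idle_days},
    )
```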

I’ve laid out the entire interaction flow between our service and OpenAI Assistants.
Our goal is to clean up Files and Vector stores to avoid filling up our clients’ storage while keeping the bot functional. I’d really appreciate any suggestions.