Assistants API with local vector store


Is it possible to create an assistant that uses a vector store not hosted on OpenAI? I want to avoid uploading files and vector stores to OpenAI and instead have the assistant use vector stores that are stored locally. Is there any documentation or a guide on how to do this?


How would that be possible? If you don't add the data to the instructions or to the thread, the assistant won't be aware of it.

Yes, it is possible. I would create a function tool that gets the context from the user input. Within the tool I create and update my Chroma collection, query the vector DB with the user input, get the text related to the question, and pass it back to the LLM as the tool output; then I create a run and poll it asynchronously.
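A minimal sketch of that retrieval tool. To keep the example self-contained, a toy bag-of-words cosine-similarity store stands in for the Chroma collection (the `add`/`query` shape mirrors Chroma's, but the store and the `get_context` helper are illustrative, not part of any SDK):

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy "embedding": lowercase word counts. In practice you would use
    # Chroma's default embedding function or a real embedding model.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class LocalVectorStore:
    """Minimal local store mimicking Chroma's add/query interface."""
    def __init__(self):
        self.docs: dict[str, str] = {}

    def add(self, ids, documents):
        self.docs.update(zip(ids, documents))

    def query(self, query_text: str, n_results: int = 2) -> list[str]:
        q = embed(query_text)
        ranked = sorted(self.docs.items(),
                        key=lambda kv: cosine(q, embed(kv[1])),
                        reverse=True)
        return [doc for _, doc in ranked[:n_results]]

def get_context(store: LocalVectorStore, user_input: str) -> str:
    # The body of the function tool: retrieve the locally stored text
    # most relevant to the user's question and return it as a string.
    return "\n".join(store.query(user_input))

store = LocalVectorStore()
store.add(ids=["a", "b"],
          documents=["Chroma stores embeddings locally.",
                     "The Assistants API supports function tools."])
print(get_context(store, "where are embeddings stored?"))
```

In a real setup you would register `get_context` as a `function` tool on the assistant; when a run enters `requires_action`, call the tool locally and return its string via `client.beta.threads.runs.submit_tool_outputs(...)`, so only the retrieved snippets (never your files or vectors) are sent to OpenAI.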