I get openai.BadRequestError: Error code: 400 - {'error': {'message': "Invalid 'file_ids': array too long. Expected an array with maximum length 100, but got an array with length 300 instead.", 'type': 'invalid_request_error', 'param': 'file_ids', 'code': 'array_above_max_length'}}
So how can file_search support 10k files if vector stores can only have 100 files? I thought the files must be in the vector store to be used?
Actually ignore that. It seems like if I try to add 3 vector stores, I get an error.
"Invalid 'tool_resources.file_search.vector_store_ids': array too long. Expected an array with maximum length 1, but got an array with length 3 instead."
Yep, you discovered there’s one store per assistant.
If the API is rejecting your input because of that max length, you may have to work around it by attaching the remaining files in smaller batches, since the method does not accept an array longer than the limit.
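For what it's worth, here is a minimal sketch of that batching. The `chunked` helper and the batch size of 100 are my own, and the `client.beta.vector_stores.file_batches.create` call assumes the openai Python SDK's beta vector store endpoints (check your SDK version, as the namespace has moved around):

```python
# Sketch: work around the 100-item limit on `file_ids` by attaching files
# in batches of at most 100 per request.

def chunked(items, size=100):
    """Yield consecutive slices of at most `size` elements."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

def attach_all(client, vector_store_id, file_ids):
    """Attach every file id to the vector store, 100 per request.

    Assumes `client` is an openai.OpenAI instance and that the SDK
    exposes client.beta.vector_stores.file_batches.create -- adjust
    to your SDK version if the endpoint lives elsewhere.
    """
    for batch in chunked(file_ids, 100):
        client.beta.vector_stores.file_batches.create(
            vector_store_id=vector_store_id,
            file_ids=list(batch),
        )
```

With 300 files (as in the error above), this sends three requests of 100 ids each instead of one oversized array.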
May I ask: why use these built-in vector stores when they are so limited compared to alternatives like Chroma or Pinecone? I'm considering doing something similar to the OP but am not sure which route to take.
As far as I can tell, there is no way to have more than 20 files in the assistant playground interface, because 1) you can use only one vector store at a time, and 2) vector stores cannot be created with more than 20 files there, or be modified above that limit. Unless I've missed something?
All of my files show up, though it does seem buggy. If I load too many by clicking "more" a few times, my browser freezes.
Using ChatGPT Custom GPTs, I could only upload 10 files at a time, so there may be a UI limitation. I also couldn't delete files reliably through the web UI, though it worked fine via PowerShell and the API.