I am trying to utilize the Assistants API to retrieve data from files, and I am having issues with using the vector store.
I have tried uploading files both via the API and via https://platform.openai.com/assistants or /storage, and both methods give me the same problem.
I am trying to upload a txt file that is 153 KB. For the sake of the example I am going to use the web page, as it's more visual as to what's going on. When actually selecting the file to load, it appears to be fine.
But upon hitting the attach button and the page updating, it shows that the upload failed.
On top of this, assigning files to a vector store also seems buggy. Files that show as successfully uploaded are not mapped to any vector store either.
Try changing the name of the file. Instead of 500leads, try leads500 or leadfivehundred. I saw a post mentioning that if your file name starts with a capital letter, it will throw an error.
I'm also seeing that if I switch between vector stores where the file upload has failed (in the Storage tab on platform.openai.com), the entire page freezes and I have to force-close it.
As mentioned trying to delete the file results in a spinning wheel forever. I just checked the developer console and it looks like it might be an issue with updating the file? Let me try this in a different browser…
I am facing the same issue too; attaching or uploading files via the browser or API fails without any errors.
Are there any logs that can be checked? The API response just says "error".
I reached out to their support chat and provided details on 5/7, but have yet to hear anything back. I hope this is on their radar, as it makes Assistants unusable for me until it gets fixed.
EDIT: Turns out the assistant can handle UTF-16 fine - my file was just improperly encoded. Instead of checking the encoding type, check whether the file is properly encoded.
➜ Desktop iconv -f UTF-16 filth_utf16.txt > filth_convert.txt
iconv: unexpected end of file; the last character is incomplete.
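The same check iconv is doing above can be done in Python before uploading: a sketch of validating that a file decodes cleanly as UTF-16, where an odd byte count or truncated final character raises the same "incomplete" condition iconv reports. The function name is illustrative.

```python
def is_valid_utf16(path: str) -> bool:
    """Return True if the whole file decodes as UTF-16 without error.

    A file with a truncated last character (odd byte count, or an
    incomplete surrogate pair) raises UnicodeDecodeError here, which
    matches iconv's "the last character is incomplete" failure.
    """
    try:
        with open(path, "rb") as f:
            f.read().decode("utf-16")
        return True
    except UnicodeDecodeError:
        return False
```

Running this on the file before uploading tells you whether the encoding itself is the problem, rather than the Assistants API.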
I still have the exact same issue. It is not working for one file, but another file with similar content works fine.
Even weirder is that this is happening not only in the OpenAI platform interface and API but also in Azure AI Studio with assistants and vector stores.
I actually got it to work now. Turns out it exceeded the token limit (5M tokens per file), even though the file is only about 12 MB. I batched 5,000 rows at a time, uploaded 6 different files, and it works perfectly now. This might be excessive, but I had no clue how many tokens the original file had, so I just tested with 5k rows and it worked.
Try dividing it in half and see if it works; if not, make even smaller chunks.
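The batching described above can be sketched as a simple line-based splitter. The 5,000-row batch size is the poster's empirical choice, not a documented limit, and the `.partN` naming is just illustrative.

```python
def split_file(path: str, lines_per_chunk: int = 5000) -> list[str]:
    """Split a text file into chunks of at most `lines_per_chunk` lines.

    Writes the chunks to path.part0, path.part1, ... and returns the
    list of chunk paths, each small enough to upload separately.
    """
    with open(path, "r", encoding="utf-8") as f:
        lines = f.readlines()
    chunk_paths = []
    for i in range(0, len(lines), lines_per_chunk):
        chunk_path = f"{path}.part{i // lines_per_chunk}"
        with open(chunk_path, "w", encoding="utf-8") as out:
            out.writelines(lines[i:i + lines_per_chunk])
        chunk_paths.append(chunk_path)
    return chunk_paths
```

Each resulting chunk can then be uploaded as its own file and attached to the same vector store.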
It seems the thread is still active, so I am adding what I have just found… When an upload fails, retrying two or three more times often completes it, which is so strange.
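Given that uploads sometimes succeed on the second or third attempt, a small retry wrapper works around the intermittent failures. This is a generic sketch: `upload_fn` stands for any callable that performs the upload and raises on failure (for example, a wrapper around the SDK's file-create call); the names and defaults here are assumptions, not part of the API.

```python
import time


def upload_with_retries(upload_fn, attempts: int = 3, delay: float = 2.0):
    """Call upload_fn, retrying up to `attempts` times with a short pause.

    Returns the first successful result; re-raises the last error if
    every attempt fails.
    """
    last_error = None
    for _ in range(attempts):
        try:
            return upload_fn()
        except Exception as e:  # noqa: BLE001 - the API surfaces only a generic error
            last_error = e
            time.sleep(delay)
    raise last_error
```

With `attempts=3` this mirrors the "try two to three times more" observation above.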