When creating GPTs, the first thing I want to know is: which file formats work best when uploaded? Is there any way to process my files on my local computer before uploading? I uploaded a very long .txt file (14 MB), and after uploading it says my text is too long. What is the limit supposed to be?
What is the maximum size of an uploaded PDF file?
https://platform.openai.com/docs/assistants/tools/file-search/supported-files
For `text/` MIME types, the encoding must be one of `utf-8`, `utf-16`, or `ascii`.
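If you want to pre-process files locally (per the question above), here's a minimal sketch of an encoding check: if a file is not valid UTF-8 (which also covers plain ASCII), rewrite it as UTF-8 from a guessed fallback encoding. The `latin-1` fallback is an assumption for illustration; a detection library would be more robust.

```python
def reencode_to_utf8(path, fallback_encoding="latin-1"):
    """Rewrite `path` as UTF-8 if it isn't already valid UTF-8.

    Returns True if the file was rewritten, False if it was left alone.
    """
    with open(path, "rb") as fh:
        raw = fh.read()
    try:
        raw.decode("utf-8")
        return False  # already valid UTF-8 (ASCII is a subset); nothing to do
    except UnicodeDecodeError:
        pass
    # `fallback_encoding` is a guess; something like chardet can detect it.
    text = raw.decode(fallback_encoding)
    with open(path, "w", encoding="utf-8") as fh:
        fh.write(text)
    return True
```

Files that are genuinely UTF-16 (with a BOM) are also accepted per the docs, so this only touches files in some other encoding.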
| File format | MIME type |
| --- | --- |
| .c | text/x-c |
| .cs | text/x-csharp |
| .cpp | text/x-c++ |
| .doc | application/msword |
| .docx | application/vnd.openxmlformats-officedocument.wordprocessingml.document |
| .html | text/html |
| .java | text/x-java |
| .json | application/json |
| .md | text/markdown |
| .pdf | application/pdf |
| .php | text/x-php |
| .pptx | application/vnd.openxmlformats-officedocument.presentationml.presentation |
| .py | text/x-python |
| .py | text/x-script.python |
| .rb | text/x-ruby |
| .tex | text/x-tex |
| .txt | text/plain |
| .css | text/css |
| .js | text/javascript |
| .sh | application/x-sh |
| .ts | application/typescript |
Is this list accurate?
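For a quick local sanity check against that list, here it is as a lookup table you can run before uploading. This just mirrors the docs page quoted above; the supported set may change, so treat it as a snapshot.

```python
import pathlib

# Extension -> MIME type, copied from the docs table above.
SUPPORTED = {
    ".c": "text/x-c", ".cs": "text/x-csharp", ".cpp": "text/x-c++",
    ".doc": "application/msword",
    ".docx": "application/vnd.openxmlformats-officedocument.wordprocessingml.document",
    ".html": "text/html", ".java": "text/x-java", ".json": "application/json",
    ".md": "text/markdown", ".pdf": "application/pdf", ".php": "text/x-php",
    ".pptx": "application/vnd.openxmlformats-officedocument.presentationml.presentation",
    ".py": "text/x-python",  # .py also maps to text/x-script.python in the docs
    ".rb": "text/x-ruby", ".tex": "text/x-tex", ".txt": "text/plain",
    ".css": "text/css", ".js": "text/javascript", ".sh": "application/x-sh",
    ".ts": "application/typescript",
}

def is_supported(filename):
    """True if the file's extension appears in the supported-files table."""
    return pathlib.Path(filename).suffix.lower() in SUPPORTED
```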
$ cat blah.py
f = open("demofile.txt", "r")
print(f.read())
$ cat blah.php
<?php
// PHP code goes here
?>
$ cat blah.rb
puts 'wut'
@jaaasshh What did you use as the ‘purpose’ when you uploaded the files?
There is no purpose in the UI I showed above. It’s just drag and drop.
Kinda feels like someone just broke everything. I’d include a link, but I get: “An error occurred: Sorry, you can’t include links in your posts.”
nah
You can only use “openai.com” associated links, AFAIK. The alternative is to upload a screenshot.
I’ve uploaded Python files into a vector store with no problems, but I use the Python API method and not the dashboard. Not sure what the file purpose defaults to when you use the dashboard for vector stores.
So the solution for that looks like using the vector store object endpoint or the vector store file endpoint and checking its status field. Not sure that relates to the types of file being uploaded, though I think there’s a bug in the batch endpoint mentioned in that report.
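A sketch of that polling approach, again assuming the `openai` Python SDK's endpoint names (the `beta` prefix depends on your SDK version):

```python
import time

def wait_for_file(client, vector_store_id, file_id, timeout=60.0, interval=2.0):
    """Poll the vector store file endpoint until its status field settles."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        vs_file = client.beta.vector_stores.files.retrieve(
            vector_store_id=vector_store_id, file_id=file_id)
        # Terminal statuses per the API docs; anything else means keep waiting.
        if vs_file.status in ("completed", "failed", "cancelled"):
            return vs_file
        time.sleep(interval)
    raise TimeoutError("vector store file never left in_progress")
```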
At this point, this remains solved as far as I’m concerned.