I’m unable to save my GPT after spending the evening organizing and renaming documents so that my prompts line up with the files.
There are probably 100-150 files. I constantly get a “draft not saved” error, and clicking save does nothing.
It appears to have saved at the point where 9 files had been uploaded, but nothing has saved since, despite all that work. I also had to upload the files in batches.
All the files appear to be loaded in the platform, but it will not save.
If you open the browser dev console, it spits out an error any time you upload a file IF you already have 10 files uploaded successfully. It waits until the file has fully uploaded and then logs this to the console:
‘Uncaught (in promise) FatalServerError: Cannot upload more than 10 files’
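For anyone who wants to reproduce this, here is a minimal sketch you can paste into the dev console before uploading. It relies only on the standard `unhandledrejection` browser event; the FatalServerError name is just what the page’s own code throws, not a documented API:

```ts
// Paste into the dev console before uploading a file.
// Logs any unhandled promise rejection, including the
// "FatalServerError: Cannot upload more than 10 files" seen above.
window.addEventListener("unhandledrejection", (event) => {
  console.warn("Unhandled rejection during upload:", event.reason);
});
```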
I have the same issue. Is there any documentation on why this is happening? Is there an explicit or implicit limit on the size of the knowledge base? If so, what is it and how do I remove lower-priority files from the knowledge base?
Here is what ChatGPT-4 suggests:
Based on the information gathered from various discussions and user experiences on the OpenAI Developer Forum, it appears that there are indeed limitations to the number of files you can upload and the overall size of these files when customizing GPTs. Here’s a summary of the key points:
File Number Limitation: There seems to be a hard-coded limit of around 10 files that can be uploaded for GPTs, which aligns with other user reports of a similar cap.
File Size and Content Limitations: Beyond the number of files, there are also constraints on the size of each file and the total content that can be uploaded. Users have reported that the system stops saving after 10 files, and that merging more data into a single file can also trigger failures, indicating a limit to how much content the system can handle at one time. Another user found that a combination of 8 text files and 2 PDF files worked, but increasing the number of PDF files led to errors, suggesting that content type and file size play a role.
Documentation for Assistants: While specific information for GPTs is scarce, the documentation for Assistants on the OpenAI platform mentions a limit of 20 files per Assistant, with each file being no more than 512 MB and the total size of all files not exceeding 100 GB. This information might provide a rough guideline for GPTs as well.
Token Limits and Efficiency: There is also the question of how many tokens a document can contain before the quality of the GPT’s output begins to decline. One user speculated that the threshold might be around 25,000 to 30,000 tokens per document.
Variability and Uncertainty in Limitations: Users have reported varying experiences with these limits, suggesting that there might be other factors at play, such as the specific content of the files or the way the platform is being used at a particular time.
In light of these findings, if you’re experiencing issues with saving drafts or uploading files, it may be beneficial to review the number and size of the files you are working with. Consider reducing the number of files, compressing them, or merging them into fewer, larger documents while keeping an eye on the total size. Additionally, for specific issues and solutions, you may want to explore the OpenAI Developer Forum or contact OpenAI support for more tailored assistance.
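Building on that advice, here is a minimal sketch (Node/TypeScript) for sanity-checking a local folder of knowledge files before uploading. To be clear about what is assumed: the 10-file cap comes from the console error earlier in this thread, the 512 MB figure from the Assistants documentation, and the ~30,000-token threshold is only the speculation quoted above; the folder path and the 4-characters-per-token estimate are my own rough assumptions, and the token estimate only makes sense for plain-text files:

```ts
// Sketch only: checks a local folder of knowledge files against the
// limits discussed in this thread. None of these numbers are official
// documented limits for GPTs.
import * as fs from "fs";
import * as path from "path";

const FILE_CAP = 10;                      // cap seen in the console error above
const MAX_FILE_BYTES = 512 * 1024 * 1024; // per-file limit from the Assistants docs
const TOKEN_WARN = 30_000;                // speculative per-document quality threshold

const dir = process.argv[2] ?? "./knowledge"; // hypothetical folder of knowledge files
const files = fs.readdirSync(dir);

if (files.length > FILE_CAP) {
  console.warn(`${files.length} files exceeds the ${FILE_CAP}-file cap reported in this thread.`);
}

for (const name of files) {
  const bytes = fs.statSync(path.join(dir, name)).size;
  const approxTokens = Math.round(bytes / 4); // rough ~4 chars/token heuristic for text files
  if (bytes > MAX_FILE_BYTES) {
    console.warn(`${name}: ${bytes} bytes exceeds the 512 MB per-file limit.`);
  }
  if (approxTokens > TOKEN_WARN) {
    console.warn(`${name}: ~${approxTokens} tokens may degrade retrieval quality.`);
  }
}
```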
I have a similar problem: I was able to upload the knowledge file, but it is not retained; the file is gone after refreshing the page, even after I clicked “save”. The GPT also couldn’t access my uploaded document.
Having a similar problem here. The “Unsaved changes” message is displayed. Clicking “publish” produces a message indicating the changes were published, but “unsaved changes” is still displayed. If I leave and then return to the GPT configuration, I find that the changes were not saved. I have two uploaded files and am not over either the usage or file-size caps.
Could this be related to the usage cap? On several occasions when I have been unable to save my changes to a GPT, the next query I try reports that I have reached my usage cap. Does saving changes to a GPT somehow count toward usage? If so, an indicator of remaining usage would really help me avoid losing my work.
We’ve been running into this problem in spades. We can’t save our GPT, and the limits are incorrectly documented, or incorrectly implemented: can we have 10 files or 20? How large can they be? We seem to get into trouble with just 4 files. Maybe they can be 10 MB each? We also get the “This file contains too much text content.” message.
I realize GPTs are still in beta, and I’m not sure how OpenAI feels, but at least for us this has become something of a showstopper bug.
I get an “Error saving GPT” message a lot. Sometimes (rarely) I can click save repeatedly and it’ll eventually save.