Error saving draft when building a new GPT

So there is a shadow cap of 10 files? Good to know, but I hope it gets changed to a size limit rather than a file count. Has anyone tried uploading one huge file instead of many small ones?

It’s not really a shadow cap; the error shows up in the console, they are just missing a toast notification. It’s the same cap as uploading files to a chat message.

I combined my technical documentation into a few very long PDFs, and that seems to work quite well even at 10k+ pages. Just sometimes the “searching in knowledge” step seems to time out, but that might also be related to the recent outages.
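In case it helps, the merging itself can be scripted. A minimal sketch, assuming the pypdf package (pip install pypdf) is installed; the folder and file names are just placeholders:

```python
# Merge every PDF in a folder into one long PDF, to upload as a single knowledge file.
from pathlib import Path
from pypdf import PdfWriter

writer = PdfWriter()
for pdf in sorted(Path("saved_docs").glob("*.pdf")):
    writer.append(str(pdf))  # append all pages of each source PDF, in name order

with open("combined.pdf", "wb") as out:
    writer.write(out)
```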

1 Like

Problem solved: It seems there is a limit to the number of files you can upload. With up to 10 files there is no such error message on my end.

1 Like

It does work sometimes, but for most of my journey building a GPT, it keeps forgetting instructions, and documents randomly show up as duplicates (docs I deleted aren’t actually removed; they are magically placed back in the backend). I need to save all my conversations with the GPT builder from now on, so I can paste them back in after a refresh, but I’m not sure that will work if these issues aren’t resolved. I am losing a lot of time repeating the same tasks over and over again.

I am also getting that error. It’s really frustrating! No explanation. I can upload a few files, then it just gives me that generic “Error saving draft”. Super annoying! I can’t create my GPT without giving it the files it needs access to.

Good idea. Any ideas on how to combine lots of text files? I saved a ton of websites as text (GPT didn’t like HTML files), and I’m thinking there should be a way to combine these into one text file. I’d also like to combine PDFs the same way. This could be a good workaround for the apparent 10-file upload limit.

  1. Combine
  2. Split

I have uploaded about 250.000 historical / science / news articles this way (yes, I am feeding my public baby here).

SQL dump to .txt (about 750 MB in size), split into 100 separate files (to keep the context manageable for the GPT).

Uploading is a “pa1n 1n th3 @ss” because of all the errors.

I upload the files one by one, but about 2 out of every 3 are rejected because of the “unable to save draft” error.

Even authorizing my public server (which is done now) went wrong; I had to try about 25 times until it was fetched by DNS.

Be sure to set the line breaks and UTF encoding correctly when handling text files.

You can also write yourself a simple PHP or Python script to handle the merging and splitting, something like the sketch below.
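A minimal Python sketch, assuming the sources are UTF-8 .txt files in one folder; the folder name, chunk count, and output names are placeholders:

```python
# Merge many .txt files into one, then split the result into a handful of chunks
# that stay under the upload limit; line endings and encoding are normalized.
from pathlib import Path

SRC = Path("articles")      # folder with the individual .txt exports
CHUNKS = 10                 # target number of output files

# Merge: read everything as UTF-8 and normalize Windows line endings.
texts = [p.read_text(encoding="utf-8", errors="replace").replace("\r\n", "\n")
         for p in sorted(SRC.glob("*.txt"))]
merged = "\n\n".join(texts)
Path("merged.txt").write_text(merged, encoding="utf-8")

# Split: cut the merged text into roughly equal chunks on paragraph boundaries.
paragraphs = merged.split("\n\n")
per_chunk = max(1, -(-len(paragraphs) // CHUNKS))  # ceiling division
for i in range(0, len(paragraphs), per_chunk):
    Path(f"chunk_{i // per_chunk + 1:02d}.txt").write_text(
        "\n\n".join(paragraphs[i:i + per_chunk]), encoding="utf-8")
```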

It’s very glitchy for me - I can’t update the profile picture (sometimes it updates, but then reverts after I refresh), and most of my prompts generate errors as well. Not good.

“Something went wrong”
Every time I try to prompt I get this.

2 Likes

I see that the custom GPT works fine until you upload more than 10 documents. On the 11th document it throws the error.

1 Like

Right. It is a 10-file maximum, it seems. Anything more than that and you will need to work through the API, I assume.

You can’t have more than 10 files total, I believe. So I don’t understand how this process would work for you? If you have 100 separate files, it likely won’t save them; they will upload but be gone on refresh. You’d need to use the API for more files, I am guessing.
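For anyone who goes that route, uploading files through the API would look roughly like this. It is only a sketch with the official openai Python SDK; the file names are made up, and attaching the files to an assistant is a separate step:

```python
# Upload local files via the API instead of the GPT builder UI.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

uploaded = []
for name in ["notes_01.txt", "notes_02.txt"]:   # hypothetical file names
    with open(name, "rb") as f:
        uploaded.append(client.files.create(file=f, purpose="assistants"))

print([f.id for f in uploaded])  # file IDs you can attach to an assistant later
```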

What I did was batch my 11 files into 1, and it did not complain. So perhaps massive files would work… but it’s a bit of legwork, especially for the person earlier with the 250,000 research articles, @Foo-Bar.

It’s not working.

I am stuck on 10 files and have 100 of them.

But when I upload larger files, it also refuses.

So I cannot upload the knowledge I have.

Yeah, it’s probably a temporary restriction… I suspect it is because of the DDoS attacks earlier.

1 Like

I like knowledge, so why do less… :slightly_smiling_face:

It’s ten years of research I am uploading.

Also, I asked beforehand if there was a limit on size or number of files, and it said “no”…

3 Likes

Maybe this comes in handy for others.

I have a knowledge base about a “very specific” subject.

About 250.000 articles (news, science, historic) in one solid SQL database.

I could not upload those files as plain text files, because I was limited to 10 files max (I have 100 files for this subject).

And when I created larger files to “fit” within 10 files, GPT complained there was too much text in them.

So I created an SQL dump in XLSX format, and that was about 150 MB in size (where the 100 separate text files were about 750 MB in total).

XLSX compresses the data, I guess.

I uploaded that single XLSX file (with more than 10 years of dedicated knowledge) and that did work.

So now that GPT is working, with 250.000 articles in 1 file.

I have 9 slots left, that’s enough for another 90 years, I guess…

The only drawback is very slow analysis; text files are interpreted almost in real time, while XLSX is sloooooow.

But the advantage is the structured data; it not only knows what data is there, but also in which row (entity) it should “label” it.
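In case anyone wants to reproduce the dump-to-XLSX step, here is a rough sketch, assuming pandas plus openpyxl and a local SQLite copy of the database; the table and column names are made up:

```python
# Export a database table to a single .xlsx file, to upload as one knowledge file.
import sqlite3
import pandas as pd

conn = sqlite3.connect("articles.db")
df = pd.read_sql_query("SELECT id, title, published, body FROM articles", conn)

# One sheet, one row per article; XLSX is a zipped format, so the text compresses well.
df.to_excel("articles.xlsx", index=False, engine="openpyxl")
```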

5 Likes

If you’re using the Brave browser, that’s why, but the issue should be fixed now.

I’m having trouble adding x-openai-isConsequential to our schema. We don’t get an error, but the draft never seems to update the actual GPT.
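For context, this is roughly how we are adding it: a sketch that edits a local copy of the schema before pasting it back into the GPT editor, with the flag placed on each operation object; the file name and the chosen value are just examples:

```python
# Set x-openai-isConsequential on every operation in a local OpenAPI schema copy.
import json

with open("openapi.json") as f:          # hypothetical local copy of the action schema
    schema = json.load(f)

HTTP_METHODS = {"get", "post", "put", "patch", "delete"}
for path_item in schema.get("paths", {}).values():
    for method, operation in path_item.items():
        if method.lower() in HTTP_METHODS:
            # false should permit "Always allow"; true forces a confirmation per call.
            operation["x-openai-isConsequential"] = False

with open("openapi.json", "w") as f:
    json.dump(schema, f, indent=2)
```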

After trying many different ways to get files to stick, I zipped up multiple files and then told it to unzip and analyze them, and that seems to have worked. Not sure what the limit on the zip is, but I’ve put over 15 files in one zip so far and it’s good to go.
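If you want to script the zipping, a small sketch with Python’s standard zipfile module; the folder name is a placeholder:

```python
# Bundle many .txt files into one compressed zip, to upload as a single knowledge file.
import zipfile
from pathlib import Path

with zipfile.ZipFile("knowledge.zip", "w", compression=zipfile.ZIP_DEFLATED) as zf:
    for txt in sorted(Path("articles").glob("*.txt")):
        zf.write(txt, arcname=txt.name)  # keep a flat structure inside the archive
```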