drug
1
What is the current text limit for creating knowledge files for Custom GPTs? I tried to upload one and got this error: "This file contains too much text content. Please try again with a smaller file."
1 Like
How large was the file you tried to upload? I think I saw a limit of 10 files total, but I haven't seen any file size limits discussed yet.
drug
4
One file, 10 MB, but there's a lot of text inside.
They said the context limit was 128k on GPT-4 Turbo.
Yeah, but that is just the per-request context token limit. The files you upload are presumably converted to embeddings and managed in something akin to a vector DB. There has to be some sort of cap/limit on that, but I haven't seen anything mentioned as of yet.
You could try a "binary search" deduction approach: keep cutting the file size in half until an upload finally succeeds, then break the original file into chunks of that accepted size. It's a kludgy workaround, but it might keep you moving forward while all of this shakes out.
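Once you've found a size the uploader accepts, the splitting step could be sketched like this. This is just an illustration, not an official tool; `chunk_chars` is an assumption you'd set to whatever size your trial-and-error lands on.

```python
# Sketch: split a large knowledge file into fixed-size text chunks.
# chunk_chars is hypothetical -- set it to the largest size the
# Custom GPT uploader accepted during your binary-search trials.

def split_file(path, chunk_chars, out_prefix="chunk"):
    with open(path, "r", encoding="utf-8") as f:
        text = f.read()
    paths = []
    for i in range(0, len(text), chunk_chars):
        out_path = f"{out_prefix}_{i // chunk_chars:03d}.txt"
        with open(out_path, "w", encoding="utf-8") as out:
            # Write one chunk of at most chunk_chars characters.
            out.write(text[i:i + chunk_chars])
        paths.append(out_path)
    return paths
```

Note this splits on raw character counts, so a chunk boundary can land mid-sentence; splitting on paragraph or section breaks near the limit would be friendlier to retrieval.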
drug
8
Haha, yeah, I'll try that. Meanwhile, I've created a free tool that lets you extract your website as a Custom GPT knowledge file! Feel free to use it, it's free!
filescrapper .com
P0mme
9
Hey,
Has anyone figured it out? I'm having the same problem, even with very small files.
"This file contains too much text content. Please try again with a smaller file." Ugh.
I get it too with a 22 MB file. I need to do data analysis on it, so context shouldn't matter.