Unable to parse a 22 KB txt file into a CSV with GPT-4. Every type of request has become super slow

Is something wrong with the servers? I’m unable to do simple tasks without errors, and everything is very slow when it was flying before.

For example, I have a list of terms and meanings in a 480-line txt file, only 22 KB, and I asked ChatGPT-4 to parse it and put each term in one column and its meaning in another. It parsed 10 lines and returned an error that says “The content is too extensive”. In contrast, I recently had it divide a 27 MB CSV into 12 files, and it did that in a few minutes.

After that, I tried pasting it into ChatGPT-3.5. It appeared to do it quickly, but after looking at the results, I found it had added 65 terms and definitions of its own. I asked it to parse it again using only what was within my backtick-separated list, and it returned “error analyzing”.

I decided to upgrade from the Plus account to the Team account because of the “Higher message caps on GPT-4,” to see if that helped, but the results are the same.

The AI has a limited context window: the amount of text it can observe and act on at once. Any document you attach is treated much like what you type, just another part of the input to the AI model.

Additionally, you must consider which external tools the AI is using. If, for example, you upload a file and ask for it to be split, the AI is likely just writing code for its Python sandbox that performs the file operations, and the AI itself never sees the contents.
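
As a rough illustration, the splitting job you described would likely be handled by a script of this kind, a minimal sketch assuming a hypothetical file name and chunk count; the data never passes through the model itself:

```python
import pandas as pd

src = "/mnt/data/bigfile.csv"           # hypothetical uploaded file
df = pd.read_csv(src)

n_parts = 12
rows_per_part = -(-len(df) // n_parts)  # ceiling division

# Write each slice of rows to its own CSV; the model only sees the
# script and its return status, not the file contents.
for i in range(n_parts):
    chunk = df.iloc[i * rows_per_part:(i + 1) * rows_per_part]
    chunk.to_csv(f"/mnt/data/bigfile_part{i + 1}.csv", index=False)
```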

There is also a limit on how much text can be returned from this Python environment at once for the AI to consider: 32k text characters. The AI won’t be able to work with, expand on, or write additional metadata for the whole file in one pass.

You would have to approach this from a programmatic viewpoint. A type of instruction that would work within the AI’s cognition limits: "A file mydata.csv has been uploaded to the python sandbox mount point. You will send a script to python to retrieve the first 10 lines, then, with your understanding of those lines, write a new script that will create and append to a new file mydata_out.csv, where you write (your processing) as replacement lines. Then iteratively continue this task until all lines of the original file have been rewritten into the new file".
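
The second step of that instruction might produce something like the sketch below. This is only an assumption of what the generated script could look like, using the term/meaning example from above and a colon as the separator; the real separator would come from inspecting the first 10 lines:

```python
import csv

# Read the raw term/meaning lines and rewrite them as two CSV columns.
with open("/mnt/data/mydata.csv", encoding="utf-8") as src, \
     open("/mnt/data/mydata_out.csv", "w", newline="", encoding="utf-8") as dst:
    writer = csv.writer(dst)
    writer.writerow(["term", "meaning"])
    for line in src:
        line = line.strip()
        if not line:
            continue
        # Assumed separator between term and meaning.
        term, _, meaning = line.partition(":")
        writer.writerow([term.strip(), meaning.strip()])
```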


Is the 32k limit just for what comes back from the Python coding environment, not for the data analysis itself? So, for example, if I upload 5 or 10 MB CSVs of statistics and ask GPT-4 to find patterns or trends, could it consider all of the data present in the files?

Thanks for your help!

You can have the AI write code for large processing tasks, as long as they finish in under 60 seconds. You could have it re-sort the entire file and rearrange the columns purely through the code it writes, as in the sketch below.
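
A minimal sketch of that, assuming a hypothetical stats.csv with columns named date, region, and sales:

```python
import pandas as pd

df = pd.read_csv("/mnt/data/stats.csv")   # hypothetical uploaded file
df = df.sort_values("date")               # re-sort the entire file
df = df[["region", "date", "sales"]]      # rearrange the columns
df.to_csv("/mnt/data/stats_sorted.csv", index=False)
```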

However, you would need to be clever about the AI’s inability to perceive the whole file at once, for example by having ChatGPT write code that totals every distinct entity found within a column and returns only that small summary. A sketch of that idea follows.
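
Again a minimal sketch, assuming the same hypothetical stats.csv and a "region" column; only the compact summary is printed back, which fits comfortably within what the model can read:

```python
import pandas as pd

df = pd.read_csv("/mnt/data/stats.csv")   # hypothetical file
counts = df["region"].value_counts()      # total of every entity in the column
print(counts.to_string())                 # small summary the AI can actually perceive
```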