ChatGPT can't read long prompts

I’ll post a prompt of, for example, 2,000 words, but the AI can only summarize the first 827 words. If I ask it about anything past that, it doesn’t know the answer. It says there’s no hard limit to how much it can read at once.

Can you provide a reference for that?

1 Like

Hi @dented.r33

There’s definitely a context length for ChatGPT, and if your message exceeds it, ChatGPT throws an error telling you that the message is too long.

Consider breaking your message into chunks and then sending them as parts.

e.g. part 1/3, 2/3, 3/3
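
Something like this rough sketch is what I mean; the ~800-word chunk size is just a guess based on what you observed, not an official limit:

```python
# Rough sketch: split a long prompt into word-based chunks labelled
# "part 1/3", "part 2/3", ... so they can be pasted one message at a time.
# The 800-word default is a guess, not an official ChatGPT limit.

def split_into_parts(text: str, words_per_part: int = 800) -> list[str]:
    words = text.split()
    chunks = [
        " ".join(words[i:i + words_per_part])
        for i in range(0, len(words), words_per_part)
    ]
    total = len(chunks)
    return [f"part {n}/{total}\n\n{chunk}" for n, chunk in enumerate(chunks, 1)]


long_prompt = "..."  # your 2,000-word prompt goes here
for part in split_into_parts(long_prompt):
    print(part)  # paste each labelled part as a separate message
```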

2 Likes

Just from the AI’s response.

Do you know what an AI hallucination is?

If not then you have now seen one. ChatGPT created that to give you an answer. Notice that it did not provide any references. ChatGPT is not like a search engine such as Google that links to other pages for its results.

From: Introducing ChatGPT

ChatGPT sometimes writes plausible-sounding but incorrect or nonsensical answers.


From: What are tokens and how to count them?

Tokens can be thought of as pieces of words. Before the API processes the prompts, the input is broken down into tokens. These tokens are not cut up exactly where the words start or end - tokens can include trailing spaces and even sub-words.

Depending on the model used, requests can use up to 4097 tokens shared between prompt and completion. If your prompt is 4000 tokens, your completion can be 97 tokens at most.

Try the tokenizer page to better understand tokens.
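
If you want to count tokens yourself, OpenAI’s tiktoken library should give roughly the same counts as the tokenizer page. A quick sketch; the encoding name depends on the model, so treat “p50k_base” here as an assumption matching the 4097-token figure above:

```python
# Quick sketch of counting tokens locally with OpenAI's tiktoken library.
# The encoding name depends on the model; "p50k_base" matches the
# davinci-era models the 4097-token figure above refers to, so treat
# that choice as an assumption rather than an official ChatGPT spec.
import tiktoken

encoding = tiktoken.get_encoding("p50k_base")

prompt = "Tokens can be thought of as pieces of words."
tokens = encoding.encode(prompt)

print(len(tokens), "tokens in the prompt")
print(4097 - len(tokens), "tokens left for the completion")
```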

5 Likes

Hmm interesting. Does around 800 words seem like the limit you’ve experienced? I’ve never received this message.

Ah, I was afraid of that. Thanks for the input!

FYI

On sites that have likes (the empty heart at the bottom of each post), just clicking the like

  • is quicker
  • shows others that this was a useful post
  • helps others learn who is giving out useful information

Also, for those with notifications on, it doesn’t trigger an audible alert on the device, breaking their train of thought to read a new post that turns out to just be a thank-you. :slightly_smiling_face:

4 Likes

You might find that the first post in a new conversation will allow you to summarize longer text. ChatGPT is silently adding your chat history to each request in the background to give it context.

As the chat continues, this eats into your available tokens for the completion part.

Only the developers know the exact number, but I suspect the history grows until it hits about 1000 to 1500 tokens. Once it hits that limit, it drops the oldest interaction off the list, so the history becomes a moving window.

There will be no history when you first start a conversation - so I think you may get a longer response. It is worth a go.
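
To make the idea concrete, here is a rough sketch of what I mean by a moving window; the 1000-token budget and the 4-characters-per-token estimate are my own guesses, not documented ChatGPT behaviour:

```python
# Sketch of the "moving window" idea: purely my guess at how ChatGPT
# might manage history, not documented behaviour. Older turns get
# dropped once the history exceeds a rough token budget, so later
# prompts leave less room for your new text and the completion.

HISTORY_TOKEN_BUDGET = 1000  # guessed figure, see above

history: list[str] = []


def rough_token_count(text: str) -> int:
    # Very rough rule of thumb: roughly 4 characters per English token.
    return len(text) // 4


def build_request(new_message: str) -> str:
    history.append(new_message)
    # Drop the oldest turns (but never the newest) until the history fits.
    while len(history) > 1 and sum(map(rough_token_count, history)) > HISTORY_TOKEN_BUDGET:
        history.pop(0)
    return "\n".join(history)  # what would actually be sent behind the scenes
```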

2 Likes

Makes sense. Is there an official post about this? I would like to include this knowledge in a ChatGPT topic on another Discourse site, but would like to have an official reference to go along with it: Helpful hints about using ChatGPT - Wiki - SWI-Prolog


Along the same lines, have you tried this? Create a new chat; on the first line enter “proofread”, skip a line, and then add something to proofread. Submit the prompt. Then edit the first prompt, keeping the “proofread” but changing the text, and click Save & Submit. After repeating this several times, I find that the responses seem to get odder, as if ChatGPT is remembering the info from the previous edits.

2 Likes

There is no documentation for how ChatGPT works behind the scenes.

This is based on how I would give the chat some context of previous interactions. It also explains why “continue” works as a prompt.

1 Like

I can’t say for sure, and I haven’t found any information from OpenAI on the exact context length of ChatGPT.

However, given that ChatGPT is a sibling model of InstructGPT, they could have a similar context length.

Also consider this: if the book was released before ChatGPT’s training data cutoff date, ChatGPT would have information about it, or might even have had the complete book as part of its training dataset.

1 Like

I’ve heard rumors here and there of 8k tokens, but there’s no way to know for sure from the outside… 8k seems reasonable if they’ve improved performance; the old max was 4k, and even earlier it was 2k…

2 Likes

8k sounds great - We get to burn through tokens twice as fast :wink:

Indeed. I tried to paste a C++ program of about 8000 lines to have it explained by ChatGPT, but I got this answer back, and the model has (seemingly) no cue about the input constraints imposed by the GUI.
Even separating the program into 3 chunks was too much. It would be helpful to have a clear limit displayed to the user somewhere, instead of the generic message saying that the input is just too long. Thanks.

“The model has no clue”, typo, LOL

This is simply “abusing the model”.

What do you expect when you pass 8000 lines of code to an LLM?

Why not 80,000 lines? Or why not 800,000? Or send ChatGPT 8,000,000 lines of code and post when that fails as well?

Break up your code into small modules and submit small components to LLMs.

:slight_smile:

It would be more helpful to have CLEAR limits exposed, e.g. not more than 2000 characters per prompt or 500 words, etc.
What is the actual size of the attention (context) window?
I also said that I tried to chunk the code into 3 consecutive pieces. With proper instructions, I could have chunked it into, say, 10 or 20 pieces…
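
For example, something like this rough sketch is what I have in mind; the chunk size, the hypothetical file name, and the wording of the instructions are all my own guesses:

```python
# Rough sketch: split a large C++ source file into numbered line-based
# chunks, each prefixed with instructions telling the model to wait for
# all parts. The chunk size, file name, and instruction wording are my
# own guesses, not anything documented by OpenAI.

def chunk_source(path: str, lines_per_chunk: int = 400) -> list[str]:
    with open(path, encoding="utf-8") as f:
        lines = f.readlines()
    pieces = [
        "".join(lines[i:i + lines_per_chunk])
        for i in range(0, len(lines), lines_per_chunk)
    ]
    total = len(pieces)
    return [
        f"This is part {n} of {total} of a C++ program. "
        f"Reply only 'OK' until you receive part {total}, "
        f"then explain the whole program.\n\n{piece}"
        for n, piece in enumerate(pieces, 1)
    ]


for part in chunk_source("program.cpp"):  # hypothetical file name
    print(part)  # paste each part as its own prompt
```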

Best regards,
S-

1 Like

Use InstructGPT and not ChatGPT.

If it’s very long, you can cluster the information and send it in batches.
We have such an incredible technology that requires just a little bit of understanding and legwork to operate efficiently. If you learn how to use it, you will be able to understand and summarize whatever you send it.

2 Likes

LOL, it’s unreliable after 100 lines. You tried 8000?

1 Like