GPT-4 context length has been much worse the past few days

Hi, I was wondering if a new model was released or some filter was put up for GPT-4 that would cause text input to be checked differently?

In the past I was able to easily copy and paste 20 pages of text or more without thinking about it, but now for 4 pages of text or 500 lines of code I just keep getting this: “The message you submitted was too long, please reload the conversation and submit something shorter”.

This has become such a problem for me that I now have to run open-source models, just because of context-length limits that GPT-4 should be able to handle easily.


Hi and welcome to the community!

Unfortunately, the maximum input length in ChatGPT varies with demand.
If it's an option for you, you can try working with the Playground instead:

There you can set the total length of the input message and reply as needed, via the max_tokens parameter.
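For reference, here's roughly what a Chat Completions request with max_tokens looks like under the hood. This is just an illustrative sketch (the helper name build_chat_request is mine), and note that max_tokens caps the length of the generated reply; the input itself still has to fit in the model's context window:

```python
import json

def build_chat_request(prompt: str, max_tokens: int = 1024,
                       model: str = "gpt-4") -> dict:
    """Build the JSON body for the /v1/chat/completions endpoint.

    Note: max_tokens caps the length of the *reply*; the input
    still has to fit within the model's context window.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

body = build_chat_request("Summarize this document.", max_tokens=500)
print(json.dumps(body, indent=2))
```

You'd then POST that body to the API with your own key; the Playground exposes the same max_tokens slider in its sidebar.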

Unfortunately it's a different product from ChatGPT Plus, but if you can spare $5 to unlock GPT-4 access, it may be worth a try.

Hope this helps.

I'm talking about the length of the input you can put into the prompt, not the output length. I don't see any option in any of the playgrounds for changing an input max_tokens parameter, and I don't really see this as a good solution for something that was already a feature and literally says 128,000 tokens in the model description.
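For a rough sense of scale, even 20 pages of text should sit far below a 128,000-token window. A quick back-of-the-envelope sketch (assuming the common heuristic of roughly 4 characters per token for English text, and ~3,000 characters per page — both are rough assumptions, not exact tokenizer counts):

```python
def estimate_tokens(text: str) -> int:
    # Rough heuristic: ~4 characters per token for English text.
    return max(1, len(text) // 4)

CHARS_PER_PAGE = 3000  # assumed ballpark for a typical page
twenty_pages = "x" * (20 * CHARS_PER_PAGE)

print(estimate_tokens(twenty_pages))  # ~15,000 tokens, well under 128,000
```

So by that estimate, the inputs being rejected are nowhere near the advertised context limit.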
