Maximum Context Length Error across different models

Hey! Can you give me an example of the prompt that is causing this?

Also, to be clear, every model has a different context window length. GPT-4 currently defaults to 8k, GPT-3.5 Turbo 16k has 16k, and GPT-4 Turbo has 128k. So the fact that a prompt works with GPT-4 Turbo doesn't mean it will work with the other models, since they have smaller context windows.
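One way to catch this before the API throws the error is to sanity-check your prompt length against each model's window. Here's a rough sketch — the model-name keys and the 4-characters-per-token estimate are just assumptions for illustration (a real check would count tokens with a proper tokenizer like tiktoken):

```python
# Rough pre-send check of a prompt against each model's context window.
# The limits match the models mentioned above; 4 chars/token is a crude
# rule of thumb for English text, not a real tokenizer.

CONTEXT_WINDOWS = {
    "gpt-4": 8_192,
    "gpt-3.5-turbo-16k": 16_384,
    "gpt-4-turbo": 128_000,
}

def estimate_tokens(text: str) -> int:
    # ~4 characters per token is a common approximation for English
    return len(text) // 4

def fits(model: str, prompt: str, max_output_tokens: int = 1024) -> bool:
    # Prompt tokens plus the room reserved for the completion
    # must both fit inside the model's context window.
    limit = CONTEXT_WINDOWS[model]
    return estimate_tokens(prompt) + max_output_tokens <= limit

long_prompt = "some long prompt " * 3000  # ~51k chars, roughly 12-13k tokens
for model in CONTEXT_WINDOWS:
    print(model, fits(model, long_prompt))
```

Running this on a ~13k-token prompt shows it overflowing GPT-4's 8k window while still fitting the 16k and 128k models — which is exactly the "works on one model, errors on another" behavior you're seeing.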