Client.responses.create input < Client.chat.completions.create input

Is there a way around the 256K input limit for client.responses.create? When using client.chat.completions.create, the input was only limited by the context window. I'd like to switch to the newer API but keep hitting that 256K limit. I also can't use any RAG-based workaround; my requirement is to stay within the context window only.

The specific error I am getting is:

Error code: 400 - {'error': {'message': "Invalid 'input': string too long. Expected a string with maximum length 256000, but got a string with length 351673 instead.", 'type': 'invalid_request_error', 'param': 'input', 'code': 'string_above_max_length'}}

The same-length query is not a problem with client.chat.completions.create.

The limit likely applies to the input field when it is passed as a single string, which is interpreted as one user message.

You might be able to work around it by constructing the input as a list of messages instead, splitting the text across them.

Here is the format for input using roles and types. I show two options for splitting the text into parts, although you can start by testing your existing string inside this message format.

1. Single user message containing two text input parts:

These text parts are concatenated directly, without even a line feed between them.

[
  {
    "role": "user",
    "content": [
      {
        "type": "input_text",
        "text": "What are the latest developments in AI?"
      },
      {
        "type": "input_text",
        "text": "Provide examples from the last 6 months."
      }
    ]
  }
]

2. Two separate user messages, each containing one text input part:

[
  {
    "role": "user",
    "content": [
      {
        "type": "input_text",
        "text": "Here is documentation for my question."
      }
    ]
  },
  {
    "role": "user",
    "content": [
      {
        "type": "input_text",
        "text": "Now, illustrate its potential applications."
      }
    ]
  }
]
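The splitting in option 1 can be automated. Below is a minimal sketch that chunks an over-long prompt into input_text parts of at most 256,000 characters each, inside a single user message. Because the API concatenates the parts without inserting anything between them, a chunk boundary falling mid-word is harmless. The model name and the build_input helper are assumptions for illustration, not part of the official SDK.

```python
MAX_PART = 256_000  # documented per-string character cap

def build_input(long_text: str, max_part: int = MAX_PART) -> list:
    """Build a Responses API `input` list: one user message whose
    content is a series of input_text parts, each <= max_part chars."""
    parts = [
        {"type": "input_text", "text": long_text[i:i + max_part]}
        for i in range(0, len(long_text), max_part)
    ]
    return [{"role": "user", "content": parts}]

# Usage sketch (assumes the official openai SDK and an API key in the
# environment; "gpt-4.1" is a placeholder model name):
# from openai import OpenAI
# client = OpenAI()
# response = client.responses.create(
#     model="gpt-4.1",
#     input=build_input(very_long_prompt),
# )
```

Since the parts are rejoined verbatim, the model sees exactly the original string, so this should behave the same as the old single-string call.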

Hope that gets this sorted until OpenAI realizes that 120k tokens can be 500k characters.

Thank you, that worked. I used option one.

Sorry about this. We’ll remove the 256k limit asap.
