⚠ Responses API is badly documented

OpenAI staff, please take a look at your API Reference pages. I've noticed that the Responses API is badly documented:

The current version says that all the fields in the request body are optional! In previous versions, at least the input and model fields were required.

https://platform.openai.com/docs/api-reference/responses/create

That is because prompts were introduced.

If a prompt is supplied, it already has the input and model references.

Before input and model became optional, the API would still require those fields even when a prompt was supplied, which is why they are now optional.

It's not that we no longer need to supply any parameters, but that is another challenge; perhaps we now need to find a better arrangement for improving the docs on the point you raised.
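
For example, once a prompt has been saved in the dashboard, a call can be made without passing model or input, since the prompt already carries those references. A minimal sketch, assuming a saved prompt; the prompt id is a placeholder:

from openai import OpenAI

client = OpenAI()

# The saved prompt already references a model and the template content,
# so neither model nor input needs to be passed here.
response = client.responses.create(
    prompt={"id": "pmpt_abc123"}
)
print(response.output_text)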

For reference: Enhanced Prompt Management - #56 by kwhinnery

2 Likes

Hey @aprendendo.next, thanks for the advice. I think OpenAI’s staff could add a note in the documentation to explain this situation.

1 Like

Agreed, @sashirestela, especially since the example shared by @aprendendo.next itself shows a model parameter being passed within the API call, making it ambiguous whether it's a required field or not.

1 Like

You are right, @jai.

1 Like

Is the following statement correct when creating a Response object?

You must provide either the prompt field or the input + model fields, but not both at the same time.

That cannot be correct if the example in the documentation is assumed to actually work:

response = client.responses.create(
    model="gpt-4.1",
    prompt={
        "id": "pmpt_abc123",
        "version": "2",
        # ... (remaining fields from the documentation example)
    },
)

Does it have override behavior or ignore behavior when model is also supplied? One would have to create a "prompt" and test it just to answer that for yourself. Would the behavior then change once they settle on a policy, breaking apps? Or break after they push some other API change, forgetting the complex interplay of validation rules?

The safest thing is to assume this feature originated from a "Google has a more appealing 'playground' with more comprehensive presets, where's ours?" edict, and to completely ignore that it exists for any other use.

When providing a prompt, you can optionally also provide a model to override the one referenced by the prompt.

And if an input is provided, it will be appended to the conversation defined in the prompt template before the run.
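
Put together, that would look roughly like this (a sketch based on the behavior described above; the prompt id, version, and input text are placeholders):

from openai import OpenAI

client = OpenAI()

response = client.responses.create(
    model="gpt-4.1",  # overrides the model referenced by the saved prompt
    prompt={"id": "pmpt_abc123", "version": "2"},
    input="Summarize my last order.",  # appended to the prompt's conversation before the run
)
print(response.output_text)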

1 Like