Does Batch API support Responses endpoint with reusable prompts?

I’m running into a consistent validation failure when trying to use the Batch API with the Responses endpoint and a saved/reusable prompt.

What I’m doing:

  • Endpoint: /v1/responses
  • Each JSONL line looks like:
{"custom_id": "artist_1", "method": "POST", "url": "/v1/responses", "body": {"prompt": {"id": "pmpt_xxx", "version": "9", "variables": {"artist_name": "Example", "artist_data": "Some data"}}}}

The body works fine in single responses.create() calls.
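For reference, this is the single-call version that succeeds (a minimal sketch; the prompt ID and variable values are the placeholders from above):

from openai import OpenAI

client = OpenAI()

# The same body, sent as a single real-time call; this works.
response = client.responses.create(
    prompt={
        "id": "pmpt_xxx",  # placeholder: the saved prompt's ID
        "version": "9",
        "variables": {
            "artist_name": "Example",
            "artist_data": "Some data",
        },
    },
)
print(response.output_text)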

Problem:
When submitted via the Batch API, the job fails validation with an error saying that input is missing. My understanding is that input is not required when calling a saved prompt, but Batch seems to treat it as mandatory.

Question:

  1. Can the Batch API actually be used with the Responses endpoint when the request body only references a saved prompt?
  2. Or is this currently unsupported, meaning Batch requires the input parameter regardless?

Would be great to get confirmation on whether this is expected behavior or a bug.

Thanks!

Carl

Sure, you might have a prompt id, but what is your API call asking the AI?

That should be your "input" field, which is what is failing validation for you.

Input is only optional on the Responses endpoint because a conversation id carrying the input messages is an alternative.
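In other words, adding an input alongside the prompt reference should clear that particular validation error (a sketch built from the example above; the input string is a placeholder):

{"custom_id": "artist_1", "method": "POST", "url": "/v1/responses", "body": {"prompt": {"id": "pmpt_xxx", "version": "9", "variables": {"artist_name": "Example", "artist_data": "Some data"}}, "input": "Write a short bio for this artist."}}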

A “prompt” is a terrible name. It should be “settings” or “template”.

I have a similar problem in this topic. Instead of complaining about "input" (which I am actually providing), the Batch API blames me for not providing the "model" parameter:

BatchError(code='invalid_request', line=1, message='Model parameter is required.', param='body.model')

My batches look like this:

{"custom_id": "311360", "method": "POST", "url": "/v1/responses", "body": {"prompt": {"id": "pmpt_XXX", "version": "35"}, "input": "XXX", "reasoning": {"summary": "auto"}}}

I am on the newest openai SDK version, 1.107.3.

The problem is that when using reusable prompts, it is pointless to submit body.model on each individual batch request: I already chose a model when defining my reusable prompt, and in my case the model may even differ per prompt version, which complicates things further. So the question is: is there a way to skip the body.model parameter, or to give it something like a default value?
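A possible workaround, assuming you have to satisfy the validator for now, is to duplicate the model per line when building the JSONL file (a sketch; the version-to-model mapping is maintained by hand and the model name is a placeholder):

import json

# Hypothetical hand-maintained mapping of prompt version -> model,
# mirroring whatever was configured in the reusable prompt.
MODEL_BY_VERSION = {"35": "gpt-4.1"}  # placeholder model name

request = {
    "custom_id": "311360",
    "method": "POST",
    "url": "/v1/responses",
    "body": {
        "model": MODEL_BY_VERSION["35"],  # duplicated only to satisfy Batch validation
        "prompt": {"id": "pmpt_XXX", "version": "35"},
        "input": "XXX",
        "reasoning": {"summary": "auto"},
    },
}

with open("batch.jsonl", "w") as f:
    f.write(json.dumps(request) + "\n")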


I can confirm that I’ve successfully used the Batch API with a reusable prompt ID via the /v1/responses endpoint.
You need an “input” (the user input), and you can provide a reusable system prompt using the “prompt” object containing an “id”, just like in the normal real-time API.
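For anyone following along, the end-to-end flow looks roughly like this (a sketch; the file name, prompt ID, and input text are placeholders):

import json
from openai import OpenAI

client = OpenAI()

# One request per line: an "input" plus a "prompt" reference, as described above.
line = {
    "custom_id": "artist_1",
    "method": "POST",
    "url": "/v1/responses",
    "body": {
        "prompt": {"id": "pmpt_xxx", "version": "9"},       # placeholder prompt ID
        "input": "Write a short bio for this artist.",      # placeholder user input
    },
}

with open("requests.jsonl", "w") as f:
    f.write(json.dumps(line) + "\n")

# Upload the JSONL file, then create the batch against /v1/responses.
batch_file = client.files.create(file=open("requests.jsonl", "rb"), purpose="batch")
batch = client.batches.create(
    input_file_id=batch_file.id,
    endpoint="/v1/responses",
    completion_window="24h",
)
print(batch.id, batch.status)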

This is really confusing. Reusable prompts currently allow templating, so we can pass in variables when calling them via the standard responses.create API. But suddenly, when it comes to the Batch API, we can’t do that and have to provide user input as well?

This is an inconsistent paradigm: in the Batch API case, saved prompts are effectively treated as system prompts, while in the responses.create case they are framed as encapsulating the prompt and user input together.