Issue with Responses API when sending variables

Hello,

I’m trying to use saved ChatGPT prompts with the Responses API and the gpt-4o-mini model. I added a variable that I use in the prompt body.

However, when I make the API call, I receive the following error:

Response input messages must contain the word “json” in some form to use text.format of type json_object.

The input variable I’m using is named form_data_json.

Below is the PHP cURL code I’m using:

$payload = [
    "prompt" => [
        "id" => "pmpt_6954",
        "version" => "2",
        "variables" => ["form_data_json" => $ai]
    ],
    "input" => "", // the input value was omitted in the original post
    "text" => [
        "format" => [
            "type" => "json_object"
        ]
    ],
    "reasoning" => new stdClass(),
    "tools" => [
        [
            "type" => "file_search",
            "vector_store_ids" => ["vs_695"]
        ]
    ],
    "max_output_tokens" => 2048,
    "store" => true
];

$ch = curl_init("https://api.openai.com/v1/responses");

curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_POST => true,
    CURLOPT_HTTPHEADER => [
        "Content-Type: application/json",
        "Authorization: Bearer " . CHAT_GPT_KEY
    ],
    CURLOPT_POSTFIELDS => json_encode($payload, JSON_UNESCAPED_UNICODE)
]);

$response = curl_exec($ch);
curl_close($ch);

echo $response;

Here’s what’s happening.

With the “text” parameter you are requesting a response format. Setting text.format to “json_object” enables “JSON mode”, which tells the API that you only want and expect a JSON-structured response.

json_object output doesn’t happen magically, and this older mode doesn’t accept a schema (unlike the newer “json_schema” type). You must provide clear developer-message prompting that describes what kind of JSON the model should produce: by example, by instructions, or by writing your own schema into the instructions field.

The API has detected that the word “JSON” does not appear anywhere in a first developer message or in the instructions. If such a request were allowed through with JSON under-specified, the model could fall into loops of whitespace, so the request is rejected. The name of a variable doesn’t count toward this requirement; the model never sees that metadata.
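As a rough sketch of satisfying that check, you could add an instructions field to your payload that mentions “JSON” and describes the expected shape. The wording and the key names below are my own illustration, not anything from the original post:

```php
// Sketch: instructions that mention "JSON" so the json_object check passes.
// The wording and the keys "summary" and "fields" are illustrative placeholders.
$payload["instructions"] =
    "Respond only with valid JSON. Produce a JSON object with the keys "
    . "\"summary\" (string) and \"fields\" (array of strings). "
    . "Output nothing except the JSON.";
```

The more precisely these instructions pin down the keys and value types, the less the model drifts in JSON mode.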

The fix is one of the following:

  • switch to “type”:“text” to allow any kind of “chat” response (JSON can still be requested in the prompt);

  • switch to “type”:“json_schema” and include a robust schema for “strict” enforcement of the keys to be reproduced;

  • write “JSON” throughout your prompt if you want the API call to go through as-is.

Also note that gpt-4o-mini is not a great model at reliably following output-format instructions; a pile of search results returned from tool calls can distract it.
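For the json_schema route, the text.format block looks roughly like this. The schema name and keys here are placeholders for whatever structure you actually need:

```php
// Sketch of the "json_schema" route; schema name and keys are placeholders.
// "strict" => true makes the API enforce the schema on the output.
$payload["text"] = [
    "format" => [
        "type" => "json_schema",
        "name" => "form_extraction",
        "strict" => true,
        "schema" => [
            "type" => "object",
            "properties" => [
                "summary" => ["type" => "string"],
                "fields"  => ["type" => "array", "items" => ["type" => "string"]]
            ],
            "required" => ["summary", "fields"],
            "additionalProperties" => false
        ]
    ]
];
```

Strict mode requires every property to be listed in “required” and “additionalProperties” to be false, as above.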

PS: json_object and tools can be saved as part of the prompt itself instead of being specified in your API call, which is especially convenient when using a static per-prompt vector store.
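If the response format and the file_search tool are stored in the prompt, the call itself can shrink to roughly this (the prompt ID and variable are taken from the post above; the rest is unchanged):

```php
// Sketch: minimal payload when text.format and tools live in the saved prompt.
// Prompt id, version, and variable name are from the original post.
$payload = [
    "prompt" => [
        "id" => "pmpt_6954",
        "version" => "2",
        "variables" => ["form_data_json" => $ai]
    ],
    "max_output_tokens" => 2048,
    "store" => true
];
```

This also keeps the API call stable while you iterate on the prompt, format, and vector store in one place.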

Extra PS: the platform site’s UI for creating such a prompt is broken. You would need to write example JSONs as multi-line text with tab indents to match the “json_object” expectation, but pressing <tab> moves focus out of the instructions field in the browser, so you must paste instead. Then, a test response that is recognized as JSON gets mangled by the platform site, which does not show the actual plaintext.

I’ve made such an (unsaved) prompt as an example of instructed JSON, also with constrained sampling parameters:

The API model responds (variables are simulated there):

Hope you can figure out whether you really wanted JSON to be produced, or whether you need to prompt the AI in a different way via the system message.