Prompt variables are not saved in the Playground or detected in code, and there is no documentation on where to add them

I am using the Responses API in the Playground, which is supposed to support variables in the prompt. I added a few variables as shown in the image, but when I reload the page the variables are gone, even after clicking Update before reloading. Aren’t prompt variables supposed to be stored with each prompt version?

Also, I couldn’t find clear documentation on how to reference these variables in the actual system prompt or user prompt. I tried adding {{variable_name}} to the user prompt, but when I make the API call I get an error saying the variables are not found, as shown in the image.

Please let me know if I am missing something here.

Here is a minimal code sample to reproduce this:

import os

from openai import AsyncOpenAI

os.environ['OPENAI_API_KEY'] = 'your_api_key'

client = AsyncOpenAI()

# prompt_version, conversation, reference_summary, and predicted_summary
# are defined elsewhere; run this inside an async function (or a notebook
# that supports top-level await).
response = await client.responses.create(
    model="gpt-5-2025-08-07",
    prompt={
        "id": "prompt_id",
        "version": prompt_version,
        "variables": {
            "conversation": conversation,
            "reference_summary": reference_summary,
            "predicted_summary": predicted_summary,
        },
    },
    input=[
        {
            "role": "user",
            "content": [
                {
                    "type": "input_text",
                    "text": "Original conversation\n```\n{{conversation}}\n```\nReference summary:\n```\n{{reference_summary}}\n```\nPredicted summary (evaluate this):\n```\n{{predicted_summary}}\n```",
                }
            ],
        },
    ],
    reasoning={"effort": "medium"},
    tools=[],
    store=True,
    max_output_tokens=4096,
)

Try removing the “version” parameter; it will then use the default version.
In the prompt your syntax is correct: {{variable_name}} is how you reference a variable, and it will be highlighted in green when the name matches.
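
For example, a minimal sketch of that flow (the prompt ID is a placeholder, and the template text in the comment is only an example of what the stored prompt could contain):

from openai import OpenAI

client = OpenAI()

# The stored prompt (saved in the Playground) would contain template text like:
#   "Translate the user's message into {{target_language}}."
response = client.responses.create(
    prompt={
        "id": "pmpt_your_prompt_id",  # placeholder prompt ID
        "variables": {"target_language": "spanish"},
    },
    input="hi",
)
print(response.output_text)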

Thanks for replying @aprendendo.next!
I have this working in the Playground but the same thing is not working in code.

The official docs never mention any issue with passing the version.

Have you been able to use variables in your python code that calls the API? If yes, can you share a snippet and the OpenAI library version that you are using?

Sure! Here is how it looks in the Playground:

And here is the code:

from openai import OpenAI

client = OpenAI()

raw_response = client.responses.with_raw_response.create(
    prompt={
        "id": "pmpt_68987ea1c4648193b453ce7fe5a163f808614079430f9289",
        # "version": "2",
        "variables": {
            "target_language": "spanish"
        }
    },
    input="hi",
    # max_output_tokens=3048,
    store=True
)
response = raw_response.parse()

# parsed response
print(response.output_text)
# raw request content that was actually sent
print(raw_response.http_request.content.decode('utf-8'))

A few things to notice:

  • I mentioned taking “version” out as a test suggestion, but forgot to say that in production you should keep it.
  • There is currently an ongoing issue where max_output_tokens defaults to 2048 when using prompts; if you run into problems, set a higher value to work around it.
  • I think the problem might be that your variable usage is being sent in the input parameter, while it has to be in the stored prompt (Playground). The input parameter is not parsed for variables - see the sketch below.
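
In other words, the {{variable_name}} placeholders belong inside the stored prompt’s own messages, and the API call only supplies their values. A minimal sketch of that split, assuming a placeholder prompt ID and made-up example values (the template text in the comment is what you would save in the Playground):

from openai import OpenAI

client = OpenAI()

# The stored prompt (edited in the Playground) would contain a message like:
#   Original conversation:
#   {{conversation}}
#   Reference summary:
#   {{reference_summary}}
#   Predicted summary (evaluate this):
#   {{predicted_summary}}
response = client.responses.create(
    prompt={
        "id": "pmpt_your_prompt_id",  # placeholder prompt ID
        "variables": {
            "conversation": "agent: hi\ncustomer: hello",
            "reference_summary": "Short greeting; no issue raised.",
            "predicted_summary": "The customer greeted the agent.",
        },
    },
    # input carries plain text only; it is not scanned for {{...}} placeholders
    input="Please evaluate the predicted summary.",
)
print(response.output_text)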

I wonder if the issue is that the version of the “prompt” that has the variables added to it was never saved?

You got “unknown prompt variables” as an error - meaning they didn’t match what is expected within the saved prompt.

The Playground only makes fake use of variables: it replaces the strings itself and sends the resulting messages, never employing a prompt ID or variables.
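
As a rough illustration (an assumption about the Playground’s behaviour inferred from what it sends, not documented internals), that substitution amounts to local string replacement before the request goes out:

# Hypothetical sketch of what the Playground appears to do locally
template = "Translate the user's message into {{target_language}}."
values = {"target_language": "spanish"}

rendered = template
for name, value in values.items():
    rendered = rendered.replace("{{" + name + "}}", value)

print(rendered)  # Translate the user's message into spanish.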


Another form of “prompt” for your API parameter, sending two variables (with a more plausible use than the documentation examples).

prompt_obj = {
    "id": "pmpt_12341234",
    "version": "2",
    "variables": {
        "ai_role_name": {
            "type": "input_text",
            "text": "Marv"
        },
        "custom_instructions": {
            "type": "input_text",
            "text": "A sarcastic chat partner."
        }
    }
}

Ran it RESTful (plain HTTP), so no SDK bug-of-the-day involved.
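
For reference, a minimal sketch of posting that prompt_obj directly to the REST endpoint with requests (the prompt ID is a placeholder, and whether you also pass model or input depends on what the stored prompt already defines, so treat those as assumptions):

import os
import requests

prompt_obj = {
    "id": "pmpt_12341234",  # placeholder prompt ID
    "version": "2",
    "variables": {
        "ai_role_name": {"type": "input_text", "text": "Marv"},
        "custom_instructions": {"type": "input_text", "text": "A sarcastic chat partner."},
    },
}

resp = requests.post(
    "https://api.openai.com/v1/responses",
    headers={
        "Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}",
        "Content-Type": "application/json",
    },
    json={
        "prompt": prompt_obj,
        "input": "hello there",  # assumed extra user input
    },
)
resp.raise_for_status()
print(resp.json()["output"])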

As @_j rightly pointed out, please make sure to update the prompt and either retrieve the version number or set the new update as the default.

A code snippet will be generated, but unfortunately, it will only be available until the popup is closed. :man_facepalming:

I added “by prompt v2” as a way to verify if the correct version was being used - you can try something similar during your initial tests.