This question is hopefully best answered by someone at OpenAI, though I’d appreciate insights from anyone who understands this design decision.
From the migration docs (Assistants → Responses), we have:
“Assistants were persistent API objects that bundled model choice, instructions, and tool declarations—created and managed entirely through the API. Their replacement, prompts, can only be created in the dashboard, where you can version them as you develop your product.”
I don’t see any technical reason why prompts couldn’t support CRUD operations and versioning through the API. Why restrict this functionality to the dashboard only?
Previously, developers could create Assistants entirely via API. Having parity here—being able to create, update, and manage prompts programmatically—feels both expected and logical. As it stands, this change makes adoption and management of Responses with Prompts more manual and painful.
Is this simply a matter of time before API support for Prompts is added, or is the dashboard-only approach intentional and permanent? Help me understand the rationale here.
I am also a little confused by the documentation for the new Responses prompts, because (1) I cannot find the prompts dashboard in my API admin panel. I'm not sure if it's rolling out over time or what, but I received an email that Assistants are being deprecated, yet I apparently do not have access to prompts (that I can find).
And (2), as you note, the documentation says “Their replacement, prompts, can only be created in the dashboard”, yet it appears in my initial testing that you can provide inline prompts directly in your code like so:
from openai import OpenAI

text = "some task to give to gpt"

def function_name(self, text, apikey=None):
    if apikey:
        self.client = OpenAI(api_key=apikey)
    resp = self.client.responses.create(
        model=self.model,
        input=[
            {"role": "system", "content": "PROMPT INSTRUCTIONS HERE"},
            # Responses API user content parts use type "input_text", not "text"
            {"role": "user", "content": [{"type": "input_text", "text": text}]},
        ],
    )
    return resp.output_text
Which, in my mind, contradicts the “can only be created in the dashboard” documentation.
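For what it's worth, the Responses API does also accept a *reference* to a dashboard-created prompt via a `prompt` parameter, which may be what the docs mean by "created in the dashboard". A minimal sketch of the request shape as I understand it (the prompt ID, version, variable name, and model are all placeholders, and I haven't verified this against the live API):

```python
# Sketch of a Responses request that references a dashboard-managed prompt.
# Requires: pip install openai
# from openai import OpenAI

# Placeholder ID copied from the dashboard; replace with your own.
PROMPT_ID = "pmpt_example123"

request_kwargs = {
    "model": "gpt-4o",  # may be pinned by the prompt itself
    "prompt": {
        "id": PROMPT_ID,
        "version": "1",
        # Fills {{task}}-style placeholders defined in the dashboard prompt.
        "variables": {"task": "some task to give to gpt"},
    },
}

# client = OpenAI()
# resp = client.responses.create(**request_kwargs)
# print(resp.output_text)
```

So "prompts" seem to exist in two flavors: inline messages (as in my snippet above) and dashboard-managed, versioned objects referenced by ID, and only the latter is dashboard-only.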
This transition has been very confusing.
EDIT: After looking some more, it appears that “prompts” in the documentation may refer to “chats” in the dashboard. Why can’t the terminology match between the documents and the dashboard?
Do you not have this page (see screenshot)? https://platform.openai.com/chat
The prompts that you create should show up here.
Renaming Assistants to Prompts has been a step backwards and confusing. They should have chosen Agents (which is what Responses is all about: agentic behavior).
It’s helpful to know that you can send system instructions inline ({“role”: “system”, “content”: “PROMPT INSTRUCTIONS HERE”}).
I need to check if it’s possible to add the additional prompt properties, such as Tools, inline in the Python request. If that’s possible, then I would have to question why Chat Prompts are useful at all, given that they require manual creation in the OpenAI dashboard.
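From my quick reading, the answer appears to be yes: `responses.create` takes a `tools` list directly, alongside the inline system message. A sketch of what that request might look like (the model name and the `web_search` tool type are assumptions on my part, not verified against the live API):

```python
# Sketch of an inline Responses request that also declares tools,
# with no dashboard-managed prompt involved.
# Requires: pip install openai
# from openai import OpenAI

request_kwargs = {
    "model": "gpt-4o",  # placeholder model name
    "input": [
        {"role": "system", "content": "PROMPT INSTRUCTIONS HERE"},
        {"role": "user", "content": [{"type": "input_text", "text": "some task"}]},
    ],
    # Assumed built-in tool type; check the docs for the exact name.
    "tools": [{"type": "web_search"}],
}

# client = OpenAI()
# resp = client.responses.create(**request_kwargs)
```

If this works, dashboard Prompts would mainly buy you versioning and reuse across requests, not capabilities you can't express inline.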