Hello everyone. I’m a little late to the party, but still.
From what I understand, the Responses API has no such thing as system instructions (the prompt you feed to the agent on creation). If I want to customise the “prompt” in the new Responses API, the best I can do is feed the model its instructions as the initial message in the context? But isn’t that so much worse than actual system instructions?
Some clarification: in my project I use dynamically created assistants that each get fed an agent prompt (system instructions), and those assistants get used for huge conversations without losing their instructions and customisation. It’s all achieved through the API, not the dashboard, which is great. I can edit an agent prompt on the fly and re-inject the context so the conversation history isn’t lost.
With the Responses API, I don’t see how that’s possible, since the migration guide explicitly states that you should “recreate an Assistant’s instruction + tool bundle” for each existing assistant IN THE DASHBOARD. And what if I have hundreds of them, dynamically created, deleted and edited? That’s like the dumbest “feature parity” you can imagine. I hope I’m getting something wrong here.
My use case is very simple (a glorified multi-character chatbot, in essence), but the Assistants API was a great solution for all my needs. Now that it’s being deprecated, I have no idea how to migrate my functionality to the new API and replace the features I rely on with something that (apparently) doesn’t even exist in the Responses API.
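The only way I can see the hundreds-of-characters case working is to keep the prompts in my own storage and attach them per request, so create/edit/delete never touches the dashboard at all. A rough sketch of what I mean (all names hypothetical, and again only building the request kwargs, not calling anything):

```python
# Rough sketch: manage many character prompts in my own store, edited at
# runtime, and re-sent with every request. Nothing here touches the dashboard.
# The `instructions` field on the request is an assumption from the docs.

prompts: dict[str, str] = {}  # character_id -> agent prompt (system instructions)

def upsert_character(character_id: str, instructions: str) -> None:
    """Create a character, or edit its prompt on the fly."""
    prompts[character_id] = instructions

def delete_character(character_id: str) -> None:
    prompts.pop(character_id, None)

def request_for(character_id: str, user_message: str) -> dict:
    """kwargs for client.responses.create(); the prompt rides along every turn."""
    return {
        "model": "gpt-4.1",                      # assumed model name
        "instructions": prompts[character_id],
        "input": user_message,
    }

upsert_character("watson", "You are Dr. Watson.")
print(request_for("watson", "Case notes, please."))
```

That would replicate my current setup, but it means the platform-side “prompt” object buys me nothing — which is sort of my whole complaint.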
To test how it would play out, I went to the dashboard to view my assistants and opened one. The migration guide says:
Identify the most important assistant objects in your application.
Find these in the dashboard and click Create prompt.
I didn’t even see the “Create prompt” button.
Worst of all, OpenAI decided to ditch the API-only approach, as per the migration guide:
Assistants were persistent API objects that bundled model choice, instructions, and tool declarations—created and managed entirely through the API. Their replacement, prompts, can only be created in the dashboard, where you can version them as you develop your product.
I’m sorry if I didn’t do enough research, but I can’t wrap my head around how I should migrate to that new API. Or maybe the best solution is to migrate to another LLM provider xD
