Assistants API to Responses API migration - API-only assistant creation and system instructions

Hello everyone. I’m a little late to the party, but still.

From what I understand, the Responses API has no such thing as system instructions (the prompt you feed to the agent on creation). If I want to customise the “prompt” in the new Responses API, the best I can do is feed the model its instructions as the initial message in the context? But isn’t that so much worse than actual system instructions?

Some clarification: my project uses dynamically created assistants that each get fed an agent prompt (system instructions), and those assistants handle huge conversations without losing their instructions and customisation. It’s all done through the API, not the dashboard, which is great: I can edit an agent prompt on the fly and re-inject context so the conversation history isn’t lost.
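To make the flow concrete, here is a minimal sketch of what I do today with the (now-deprecated) Assistants API. The names are made up, and I build the payloads as plain dicts — these are what would be passed to `client.beta.assistants.create(...)` / `client.beta.assistants.update(...)` — so the example runs without an API key:

```python
# Sketch of the dynamic-assistant flow described above (Assistants API).
# Payloads are plain dicts so this runs offline; in real code they would
# be passed to client.beta.assistants.create(**payload) / .update(...).

def build_assistant_payload(character_name, agent_prompt):
    """Bundle model choice + system instructions for one dynamic agent."""
    return {
        "model": "gpt-4o",             # model choice is part of the bundle
        "name": character_name,
        "instructions": agent_prompt,  # the persistent system instructions
    }

def update_instructions(payload, new_prompt):
    """Edit the agent prompt on the fly; conversation history is untouched."""
    updated = dict(payload)
    updated["instructions"] = new_prompt
    return updated

bot = build_assistant_payload("narrator", "You are the story's narrator.")
bot = update_instructions(bot, "You are the story's narrator. Be concise.")
print(bot["instructions"])
```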

With the Responses API, I don’t see how that’s possible, since the migration guide explicitly states that you should “recreate an Assistant’s instruction + tool bundle” for each existing assistant IN THE DASHBOARD :person_facepalming: . And what if I have hundreds of them, dynamically created, deleted and edited? That’s like the dumbest “feature parity” you can imagine. I hope I’m getting something wrong here.

My use case is very simple (a glorified multi-character chatbot, in essence), but the Assistants API was a great solution for all my needs. Now that it’s being deprecated, I have no idea how to migrate my functionality to the new API and replace the features I use with something that (apparently) doesn’t even exist in the Responses API.

To test how it would play out, I went to the dashboard to view my assistants and opened one. The migration guide says:
Identify the most important assistant objects in your application.
Find these in the dashboard and click Create prompt.

I didn’t even see the “Create prompt” button.

Worst of all, OpenAI decided to ditch the API-only approach, as per the migration guide:
Assistants were persistent API objects that bundled model choice, instructions, and tool declarations—created and managed entirely through the API. Their replacement, prompts, can only be created in the dashboard, where you can version them as you develop your product.

I’m sorry if I didn’t do enough research, but I can’t wrap my head around how I should migrate to the new API. Or maybe the best solution is to migrate to another LLM provider xD


Hello, welcome to this forum.

The guide is a bit confusing, but the link to create prompts can be found here (Create > Chat).

But perhaps another approach would be to go through the Responses API starting guide from scratch. It might be less overwhelming to start fresh than to try to make sense of something you barely know yet.

I recommend being patient; there is still time.

Wish you good luck with your migration.


You are right — there is no “Create prompt” button! I did go to the Prompts tab on the dashboard and created a new prompt, but it required me to re-add the vector stores, instructions, and models all over again. How is that a straightforward migration?

Further, the concept of “conversation” seems to exist only to carry forward legacy threads. For example, the code snippet is:

conversation = openai.conversations.create(items=items)

From what I can tell, the only reason to use conversations is to convert an existing Assistant thread.
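For reference, here is a fleshed-out sketch of that snippet in the current Python SDK style. The legacy messages are invented, and the API calls themselves are commented out so nothing here actually hits the network — treat it as an illustration, not a verified migration recipe:

```python
# Sketch: converting legacy Assistant thread messages into Conversation
# items. The create/respond calls are commented out so this runs offline.

legacy_messages = [  # stand-in for messages pulled from an old thread
    {"role": "user", "content": "Hi"},
    {"role": "assistant", "content": "Hello! How can I help?"},
]

items = [
    {"type": "message", "role": m["role"], "content": m["content"]}
    for m in legacy_messages
]

# from openai import OpenAI
# client = OpenAI()  # reads OPENAI_API_KEY from the environment
# conversation = client.conversations.create(items=items)
# resp = client.responses.create(
#     model="gpt-4o",
#     conversation=conversation.id,   # continue the carried-over history
#     input="Continue where we left off.",
# )

print(len(items), items[0]["type"])
```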

Now, in the OpenAI dashboard we have “Completions,” “Responses,” and “Conversations.” What was the need to introduce “Conversations” if it only holds converted legacy threads?

From my perspective, this creates unnecessary confusion. I run a SaaS platform that lets users provide their OpenAI Assistant IDs, and we generate code for embedding them into their websites. When I heard about the deprecation of the Assistants API, we built support for the Responses API. The Responses API is significantly faster than the Assistants API, and all it really needs is support for bundled tools (RAG/functions), which was not a big deal to roll into my platform.

The Responses API already logs chat conversations under the Logs tab, which I’ve found sufficient. Honestly, I’m not sure who would find the “Conversations” tab useful or plan to use it. If you do, I’d love to understand your use case. Among thousands of users on my SaaS platform, only one has ever requested this.