How to choose a model in the Responses API's conversations?

So I want to try using the Responses API's Conversations feature; here is the documentation: https://platform.openai.com/docs/api-reference/conversations/create

But it doesn't include how you choose the model you're using.


Hello and welcome to the Community!

It’s because we don’t select a model when calling conversations.create.
The endpoint is only used to store conversation state, such as items and metadata. The model is selected when calling responses.create. That request includes the model field and can be linked to an existing conversation using the conversation parameter.

Example flow: first create a conversation, then generate a response with a model.

1. Create a conversation (no model here):

curl https://api.openai.com/v1/conversations \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "items": [
      { "type": "message", "role": "user", "content": "Hello!" }
    ],
    "metadata": { "topic": "demo" }
  }'
2. Generate a response with a model, tied to that conversation:

curl https://api.openai.com/v1/responses \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o",
    "conversation": "conv_123",
    "input": "Continue the conversation."
  }'

conversations.create only accepts items and metadata in the request body. There is no model field on that endpoint.
responses.create is where you choose the model; you attach the response to an existing conversation by passing the conversation ID in the conversation parameter.
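A small Python sketch of the same two-step split, here just building the two request bodies to highlight which endpoint takes which fields (endpoint shapes and field names follow the curl examples above; actually sending them would require an HTTP client and API key):

```python
import json

def build_conversation_request(items, metadata=None):
    """Body for POST /v1/conversations -- note: no "model" field here."""
    body = {"items": items}
    if metadata:
        body["metadata"] = metadata
    return body

def build_response_request(model, conversation_id, user_input):
    """Body for POST /v1/responses -- the model is chosen here."""
    return {
        "model": model,
        "conversation": conversation_id,
        "input": user_input,
    }

conv_body = build_conversation_request(
    items=[{"type": "message", "role": "user", "content": "Hello!"}],
    metadata={"topic": "demo"},
)
resp_body = build_response_request("gpt-4o", "conv_123", "Continue the conversation.")

print(json.dumps(conv_body, indent=2))
print(json.dumps(resp_body, indent=2))
```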

Hope that helps!


More clarification:

The conversations API is optional.

It provides server-side storage of conversation state (history), and is only useful in conjunction with the Responses API endpoint. Without it, you would send all of the chat session's messages from your own stored history with each request, not just the latest user input.
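To see what that saves you from, here is a sketch of the manual alternative: keeping the full message history client-side and resending all of it on every Responses API call. `call_model` is a hypothetical stand-in for the actual API request, not a real SDK function:

```python
history = []  # full chat history, kept by your own application

def call_model(model, messages):
    # Hypothetical stand-in for POST /v1/responses with "input": messages.
    # A real call would return the model's reply text.
    return f"(reply to {len(messages)} messages)"

def chat(user_text, model="gpt-4o"):
    history.append({"role": "user", "content": user_text})
    reply = call_model(model, history)  # the ENTIRE history is resent each time
    history.append({"role": "assistant", "content": reply})
    return reply

chat("Hello!")
chat("Tell me more.")
print(len(history))  # 4 items: two user turns, two assistant turns
```

With a conversation ID, the server keeps `history` for you and you send only the latest input.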

To use it, you must pre-create a conversation and get back its ID. Seeding the conversation with messages at creation time is possible but optional, and not a likely pattern, since messages can also rotate out without observability once the history grows too large for the model's input.

Then, when a Responses API request with the input "I have banana fingers" is sent along with that (initially empty) conversation ID, the input is stored in the conversation along with the AI response "wipe your fingers" - ready for the same calling pattern again, with your growing chat remembered.
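The storage behaviour just described can be sketched with an in-memory stand-in for the server-side conversation store (a toy simulation, not the real API; the canned reply is the example from the paragraph above):

```python
# Toy server-side conversation store: conversation ID -> list of items.
store = {"conv_123": []}

def respond(conversation_id, user_input):
    """Simulate POST /v1/responses with a conversation ID attached."""
    items = store[conversation_id]
    items.append({"role": "user", "content": user_input})
    reply = "wipe your fingers"  # canned stand-in for the model's answer
    items.append({"role": "assistant", "content": reply})
    return reply

respond("conv_123", "I have banana fingers")
# Both the input and the reply are now stored under the conversation ID,
# ready to be reused on the next call with the same ID.
print(store["conv_123"])
```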

In this unanswered topic confirming how the API didn't work right (a bug report that then had to be upgraded because an image message was crashing conversations too), you'll find, at the end, a nice block of Python code to chat with the friendly AI with response streaming.
