/v1/responses with background:true does not append I/O to conversation (works when background:false)

When calling the Responses API with a valid conversation (string or { id }) and background:true, the request completes but the “user input” and “assistant output” are not appended to that conversation.

If I change only background to false, the items are appended as expected.

Environment (my setup)

  • Endpoint: POST /v1/responses with conversation (string or object)

  • Option causing issue: background:true

  • Model: gpt-5

  • Storage: store:true

  • Context: I am not using previous_response_id in the same request

  • Same API key, same conversation ID across the A/B tests
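The A/B test above can be sketched as two request payloads that differ only in the `background` flag. This is a minimal sketch, not a full client: the conversation id and input text are placeholders, and the payload builder is a hypothetical helper, not part of any SDK.

```python
def build_responses_payload(conversation_id: str, background: bool) -> dict:
    """Build a POST /v1/responses body mirroring the options listed above.

    The conversation id and prompt are placeholders for illustration.
    """
    return {
        "model": "gpt-5",
        "conversation": conversation_id,  # string form; {"id": ...} is also accepted
        "input": "hello",
        "store": True,
        "background": background,
    }

# The two test payloads differ only in the background flag.
a = build_responses_payload("conv_123", background=True)   # items not appended (the bug)
b = build_responses_payload("conv_123", background=False)  # items appended as expected
diff = {k for k in a if a[k] != b[k]}
```

Diffing the two payloads confirms that `background` is the only variable in the experiment.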


I’m seeing the same issue, but with model gpt-4o.

When I call POST /v1/responses with a valid conversation and background:true, the response completes but the input/output are not appended to the conversation. If I switch to background:false, they are stored correctly.

Did you manage to resolve this on your side, or is this still an open bug?

I’m getting this as well. I tried another approach, but the problem with background: false is that if the stream gets cut, you can’t retrieve those items from the conversation items endpoint either. So there’s effectively no way to run a long-running task with the Conversations API.
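For a long-running task with background: true, one pattern is to poll the response by id until it reaches a terminal status. This is a sketch under stated assumptions: the status names follow the Responses API lifecycle, and the fetch function is injected (e.g. a thin wrapper around GET /v1/responses/{id}) so the loop itself runs without a network call.

```python
import time

# Terminal statuses in the Responses API lifecycle.
TERMINAL = {"completed", "failed", "cancelled", "incomplete"}

def poll_background_response(fetch, response_id: str,
                             interval: float = 1.0, max_polls: int = 100) -> dict:
    """Poll a background response until it leaves queued/in_progress.

    `fetch` is any callable mapping a response id to the retrieved
    response dict, so the loop can be exercised with a stub.
    """
    for _ in range(max_polls):
        resp = fetch(response_id)
        if resp.get("status") in TERMINAL:
            return resp
        time.sleep(interval)
    raise TimeoutError(f"response {response_id} did not finish after {max_polls} polls")

# Stub fetcher for illustration: queued, then in_progress, then completed.
states = iter(["queued", "in_progress", "completed"])
fake_fetch = lambda _id: {"id": _id, "status": next(states)}
final = poll_background_response(fake_fetch, "resp_abc", interval=0.0)
```

The injected fetcher keeps the retry logic separate from the HTTP client, which also makes it easy to unit-test.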

Hi there! I just pushed a fix for this today, so conversations should now get items from responses when run in background mode. Please let me know here if you are still seeing issues, and thanks for your patience on our delay in fixing this issue!


Great work!
During this period, I have been using background:false consistently.

The conversation feature is quite similar to the previous_response_id functionality. I have tried both features, but conversation is more convenient for topic management.

However, both features have the same bug: if a call involves both code_interpreter and reasoning, the next call inevitably fails with a “400 Bad Request: Item ‘rs_…’ of type ‘reasoning’ was provided without its required following item” error from the Responses API.

I originally thought that switching from previous_response_id to conversation would resolve this issue, but the problem persists.

Please pay attention to this matter. Thank you.


Hello! Thank you for taking a look at this! I’m testing it out now, and there are two different bugs I’m seeing just by switching to background: true and combining it with a conversation id.

  1. Duplicate item

If I let the stream finish, a duplicate conversation item is inserted in the list conversation items response. This blocks follow-up responses with the same conversation id.

LIST:

{
  "object": "list",
  "data": [
    {
      "id": "msg_68c360b3180481949ecb04875a722f6c0b177e8dd89b18ed",
      "type": "message",
      "status": "completed",
      "content": [
        {
          "type": "output_text",
          "annotations": [],
          "logprobs": [],
          "text": "Hello! How can I assist you today?"
        }
      ],
      "role": "assistant"
    },
    {
      "id": "msg_68c360b0fa74819480747fed468fc66b0b177e8dd89b18ed",
      "type": "message",
      "status": "completed",
      "content": [
        {
          "type": "input_text",
          "text": "hello"
        }
      ],
      "role": "user"
    },
    {
      "id": "msg_68c360b3180481949ecb04875a722f6c0b177e8dd89b18ed",
      "type": "message",
      "status": "completed",
      "content": [
        {
          "type": "output_text",
          "annotations": [],
          "logprobs": [],
          "text": "Hello! How can I assist you today?"
        }
      ],
      "role": "assistant"
    }
  ],
  "first_id": "msg_68c360b3180481949ecb04875a722f6c0b177e8dd89b18ed",
  "has_more": false,
  "last_id": "msg_68c360b3180481949ecb04875a722f6c0b177e8dd89b18ed"
}

Follow-up response:

{
  "error": {
    "message": "Duplicate item found with id msg_68c360b3180481949ecb04875a722f6c0b177e8dd89b18ed. Remove duplicate items from your input and try again.",
    "type": "invalid_request_error",
    "param": "input",
    "code": null
  }
}
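Until the duplicate is fixed server-side, one possible client-side guard (a sketch, not an official workaround) is to drop repeated ids before feeding the listed items back into `input`. The item dicts below are trimmed stand-ins for the full list response above.

```python
def dedupe_items_by_id(items: list[dict]) -> list[dict]:
    """Keep only the first occurrence of each item id, preserving order."""
    seen: set = set()
    out = []
    for item in items:
        item_id = item.get("id")
        if item_id in seen:
            continue  # skip duplicates like the repeated assistant message above
        seen.add(item_id)
        out.append(item)
    return out

# Mirrors the duplicated list above: the assistant message appears twice.
items = [
    {"id": "msg_a", "role": "assistant"},
    {"id": "msg_b", "role": "user"},
    {"id": "msg_a", "role": "assistant"},
]
cleaned = dedupe_items_by_id(items)
```

After deduplication, resubmitting the items should no longer trigger the “Duplicate item found” error shown above.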

  2. If you do NOT let the stream finish in the original response, the background storing does work (thanks!), but it doesn’t persist the original user message:
{
    "object": "list",
    "data": [
        {
            "id": "msg_68c361b5c3848197b33695a8fb0061c30044df19232d9ce9",
            "type": "message",
            "status": "completed",
            "content": [
                {
                    "type": "output_text",
                    "annotations": [],
                    "logprobs": [],
                    "text": "Sure! Here's a short story for you:..."
                }
            ],
            "role": "assistant"
        }
    ],
    "first_id": "msg_68c361b5c3848197b33695a8fb0061c30044df19232d9ce9",
    "has_more": false,
    "last_id": "msg_68c361b5c3848197b33695a8fb0061c30044df19232d9ce9"
}

It will allow follow-up responses, but the model has no memory of what the user’s request was. I confirmed that by asking if it remembered, so it’s not just a problem with the LIST API.


Thank you for taking a look at this.

Hi there! Great find, I have a fix coming for this that I hope to have out for you today. I will update here once it should be working as expected.

Okay, the issue with background + streaming should now be fixed. Let me know if you see any more edge cases, happy to address. Thank you.


Thank you for your work on this.
We just ran a quick test, and it seems the issues have all been resolved.
This is exciting: we can re-enable previous_response_id and conversation.
This will save us a lot of time.
Thank you very much!


This is working perfectly now! Thank you so much. The Conversations API is really fun to build with, nice work on it.