Need Help: "No tool output found for function call" Error After Function Call in Responses API

Hi all,

I’m working with the Responses API for my project and I’m encountering an issue after a function call.

Here’s how I’ve set up my tool definitions:

tools = [
    { "type": "web_search_preview" },
    {
        "type": "file_search",
        "vector_store_ids": ["store_id_1"],
    },
    {
        "type": "function",
        "name": "xxx",  
        "description": "xxxx",
        "strict": True,
        "parameters": {
            "type": "object",
            "properties": {},
            "required": [],
            "additionalProperties": False        
        }
    }
]

Here’s my API call logic:

api_arguments = {
    "model": self.model,
    "input": messages,
    "temperature": self.temperature
}

if tools:
    api_arguments["tools"] = tools

if previous_response_id:
    api_arguments["previous_response_id"] = previous_response_id

response = self.client.responses.create(**api_arguments)

After a function call, I am appending follow-up messages like this:

# Append the model's function call message
follow_up_messages = [output_item]

# Append the function call output
follow_up_messages.append({
    "type": "function_call_output",
    "call_id": output_item.call_id,
    "output": str(return_val)
})

logger.info(f"output_item: {output_item}")

# Request a follow-up response
follow_up_response = st.session_state.open_ai.get_llm_responses(
    follow_up_messages,
    nexi_tools, 
    None
)

The problem:
After the function call, when I try to input something new, I get this error:

*** Error occurred while question prompt: Error code: 400 - {'error': {'message': 'No tool output found for function call call_xxx.', 'type': 'invalid_request_error', 'param': 'input', 'code': None}}

It seems like the function call output isn’t being linked correctly, but I’m not sure what I’m missing.

Could someone please help me figure this out?

Thank you,
Binjan Iyer

I figured out the solution to the problem. I was not sending the result of the function call back properly, so the request did not have all the information it needed and the OpenAI Responses API returned the error.

Append only the function call output in the follow-up request, and pass the previous_response_id:

# Append the function call output
follow_up_messages.append({
    "type": "function_call_output",
    "call_id": output_item.call_id,
    "output": str(return_val)
})

# Request a follow-up response
follow_up_response = st.session_state.open_ai.get_llm_responses(
    follow_up_messages,
    nexi_tools, 
    api_arguments["previous_response_id"]
)
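
In other words, once the tool output has been submitted, the next user turn should chain to the id of the response that came back after the output, not to the earlier response that issued the function call. A minimal sketch with the raw client (assuming follow_up_response here is the response object returned by my wrapper, and a placeholder user message):

# Next user turn: chain to the response generated after the tool output was
# submitted, so the function call already has a recorded output in the chain.
next_response = self.client.responses.create(
    model=self.model,
    input=[{"role": "user", "content": "next user question"}],
    tools=tools,
    previous_response_id=follow_up_response.id,
)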

I added what you suggested to my code and also set up the tool according to the official documentation, but this silly error:
*** Error occurred while question prompt: Error code: 400 - {'error': {'message': 'No tool output found for function call call_xxx.', 'type': 'invalid_request_error', 'param': 'input', 'code': None}}
still keeps happening. What's frustrating is that it's becoming more and more frequent.

The model may call tools repeatedly, so you have to chain each follow-up request to the previous response and feed the tool outputs back in as its input:

import OpenAI from "openai";
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

// Responses API function tools are flat objects with a JSON Schema under "parameters"
// (there is no nested "function" key like in Chat Completions).
const tools = [
  {
    type: "function",
    name: "check_calendar",
    parameters: { type: "object", properties: {}, required: [], additionalProperties: false }
  },
  {
    type: "function",
    name: "send_message",
    parameters: {
      type: "object",
      properties: { message: { type: "string" } },
      required: ["message"],
      additionalProperties: false
    }
  }
];

async function runConversation() {
  // 1️⃣ First request — send user/system messages
  let response = await openai.responses.create({
    model: "gpt-4.1",
    input: [
      { role: "system", content: "You can call tools to answer questions." },
      { role: "user", content: "Check my calendar and then message me the result." }
    ],
    tools,
    tool_choice: "auto"
  });

  // 2️⃣ Loop: resolve tool calls until no more
  while (true) {
    const calls = response.output.filter(o => o.type === "function_call");
    if (!calls.length) break;

    // Run all tools and build outputs
    const outputs = await Promise.all(
      calls.map(async (call: any) => {
        let result;
        if (call.name === "check_calendar") {
          result = { available: ["2025-08-15 10:00"] };
        } else if (call.name === "send_message") {
          result = { status: "sent" };
        }
        return {
          type: "function_call_output",
          call_id: call.call_id,
          output: JSON.stringify(result)
        };
      })
    );

    // 3️⃣ Follow-up request — chain with previous_response_id
    response = await openai.responses.create({
      model: "gpt-4.1",
      previous_response_id: response.id,
      input: outputs,
      tools,
      tool_choice: "auto"
    });
  }

  console.log("Final model output:", response.output_text);
}

runConversation();
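
If it helps, here is a rough Python sketch of the same loop with the official openai Python package, since your original code is Python; the check_calendar / send_message tools and their results are just placeholders:

import json

from openai import OpenAI

client = OpenAI()

tools = [
    {
        "type": "function",
        "name": "check_calendar",
        "parameters": {
            "type": "object",
            "properties": {},
            "required": [],
            "additionalProperties": False,
        },
    },
    {
        "type": "function",
        "name": "send_message",
        "parameters": {
            "type": "object",
            "properties": {"message": {"type": "string"}},
            "required": ["message"],
            "additionalProperties": False,
        },
    },
]

# 1. First request: send the system/user messages.
response = client.responses.create(
    model="gpt-4.1",
    input=[
        {"role": "system", "content": "You can call tools to answer questions."},
        {"role": "user", "content": "Check my calendar and then message me the result."},
    ],
    tools=tools,
)

# 2. Keep resolving function calls until the model stops asking for tools.
while True:
    calls = [item for item in response.output if item.type == "function_call"]
    if not calls:
        break

    outputs = []
    for call in calls:
        if call.name == "check_calendar":
            result = {"available": ["2025-08-15 10:00"]}
        elif call.name == "send_message":
            result = {"status": "sent"}
        else:
            result = {"error": f"unknown tool {call.name}"}
        outputs.append({
            "type": "function_call_output",
            "call_id": call.call_id,
            "output": json.dumps(result),
        })

    # 3. Follow-up request: chain with previous_response_id and send only the outputs.
    response = client.responses.create(
        model="gpt-4.1",
        previous_response_id=response.id,
        input=outputs,
        tools=tools,
    )

print("Final model output:", response.output_text)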