How do I inject function tool call messages into the Responses API?

Hi everyone,

I am trying to figure out how to use the Responses API to complete a conversation in which the function tool calls (and their results) are already available beforehand, i.e., I want to inject them into the input myself rather than have the model generate them.

I have the following script:

import asyncio
from dotenv import load_dotenv
from openai import AsyncOpenAI
from openai.types.responses import ResponseFunctionToolCall

load_dotenv()

client = AsyncOpenAI()

async def test_response():
    messages = [
        {'role': 'system', 'content': "You are a helpful assistant. You must call the functions 'get_weather' and 'get_population'. Before calling these functions, YOU MUST SAY 'Let me check the weather and the population for you...'."}, 
        {'role': 'user', 'content': "What's the weather and the current population in Paris?"}, 
        ResponseFunctionToolCall(arguments='{"city":"Paris"}', call_id='call_get_population', name='get_population', type='function_call', id='fc_get_population', status='completed'), 
        {'call_id': 'call_get_population', 'output': 'Population: 2.048 millions of people.', 'type': 'function_call_output'}, 
        ResponseFunctionToolCall(arguments='{"latitude":48.8566,"longitude":2.3522}', call_id='call_get_weather', name='get_weather', type='function_call', id='fc_get_weather', status='completed'), 
        {'call_id': 'call_get_weather', 'output': 'Temperature: 23.3 degrees Celsius.', 'type': 'function_call_output'}
    ]
    return await client.responses.create(model="gpt-4o", input=messages)

async def main():
    await test_response()


asyncio.run(main())

Running this script, I got the following error:

openai.NotFoundError: Error code: 404 - {'error': {'message': "Item with id 'fc_get_population' not found.", 'type': 'invalid_request_error', 'param': 'input', 'code': None}}

As you can see, an item with ID fc_get_population is present in the input list, so the error suggests the API is trying to resolve that ID server-side (against previously stored response items) instead of reading it from the input I provided.

Do you think the Responses API can be made to work in this case?
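One thing that might be worth trying (an untested sketch on my side, assuming the server only looks up id fields when they are present) is to pass the function calls as plain dicts and omit the id field entirely, since call_id alone is what links a function_call to its function_call_output:

import asyncio
from dotenv import load_dotenv
from openai import AsyncOpenAI

load_dotenv()

client = AsyncOpenAI()

def build_input():
    # Same conversation as above, but the function_call items are plain
    # dicts with no 'id' key -- only 'call_id' ties a call to its output.
    return [
        {'role': 'system', 'content': "You are a helpful assistant. You must call the functions 'get_weather' and 'get_population'. Before calling these functions, YOU MUST SAY 'Let me check the weather and the population for you...'."},
        {'role': 'user', 'content': "What's the weather and the current population in Paris?"},
        # Injected call: note the absence of an 'id' field here.
        {'type': 'function_call', 'call_id': 'call_get_population',
         'name': 'get_population', 'arguments': '{"city":"Paris"}'},
        {'type': 'function_call_output', 'call_id': 'call_get_population',
         'output': 'Population: 2.048 millions of people.'},
        {'type': 'function_call', 'call_id': 'call_get_weather',
         'name': 'get_weather', 'arguments': '{"latitude":48.8566,"longitude":2.3522}'},
        {'type': 'function_call_output', 'call_id': 'call_get_weather',
         'output': 'Temperature: 23.3 degrees Celsius.'},
    ]

async def test_response():
    return await client.responses.create(model="gpt-4o", input=build_input())

if __name__ == '__main__':
    print(asyncio.run(test_response()).output_text)

I have not confirmed this avoids the 404, so please treat it as a hypothesis about where the lookup happens rather than a known fix.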

Thanks a lot in advance for your help!

@atty-openai Could you please let me know whether this is a bug or expected behavior? Thanks a lot!