I’m currently integrating with the OpenAI Responses API to retrieve the full context of a conversation for analysis and storage. I’m using the following endpoint:
/v1/responses/{response_id}/input_items
When I call this endpoint with the latest response_id from a conversation, the returned object does not include the most recent text response generated by the model. Instead, it appears to contain only the historical context up to the last input item, excluding the final output.
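For reference, here is roughly how I'm calling the endpoint (a simplified Python sketch using `requests`; `resp_...` stands in for the actual response_id):

```python
import os
import requests

API_KEY = os.environ["OPENAI_API_KEY"]
response_id = "resp_..."  # latest response_id from the conversation

# List the input items that were used to generate this response
r = requests.get(
    f"https://api.openai.com/v1/responses/{response_id}/input_items",
    headers={"Authorization": f"Bearer {API_KEY}"},
)
r.raise_for_status()

for item in r.json()["data"]:
    # These are only the *input* items; the model's latest output
    # does not appear anywhere in this list.
    print(item.get("type"), item.get("role"))
```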
How can I programmatically retrieve the full conversation history, with all input-output pairs (not just the inputs), including the latest model response?