Hello everyone,
I’m integrating OpenAI’s Responses API with the new webhook feature and I want to confirm an implementation detail.
According to the documentation, when creating a response we can include a `metadata` object to store custom data (e.g., database IDs, session info). This `metadata` is returned when fetching the response directly via the API.
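For example, a direct retrieval returns it (a minimal sketch using the official openai Python SDK; the response ID is just a placeholder):

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Fetch a stored response by ID; the metadata set at creation time comes back with it.
response = client.responses.retrieve("resp_abc123")
print(response.metadata)  # e.g. {"chatId": "123"}
```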
Question
Does this `metadata` object also appear in the webhook payload when the response is completed and sent to my endpoint?
If not, is there any setting or option to enable it?
Example request
POST https://api.openai.com/v1/responses
```json
{
  "model": "gpt-4.1-mini",
  "temperature": 1,
  "max_output_tokens": 1000,
  "store": true,
  "background": true,
  "input": [
    {
      "role": "user",
      "content": "Hello"
    }
  ],
  "tools": [],
  "tool_choice": "auto",
  "metadata": {
    "chatId": "123"
  }
}
```
What I currently receive on my webhook endpoint:
```json
{
  "id": "evt_345",
  "object": "event",
  "created_at": 1755216471,
  "type": "response.completed",
  "data": {
    "id": "resp_123"
  }
}
```
Is it possible to also receive the `metadata` in the webhook payload?
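For now, my workaround is an extra round trip from the webhook handler: take `data.id` from the event and fetch the full response, which does include the metadata. A minimal sketch, assuming a Flask endpoint and the official openai Python SDK (signature verification omitted for brevity; the route and variable names are just placeholders):

```python
from flask import Flask, request
from openai import OpenAI

app = Flask(__name__)
client = OpenAI()  # reads OPENAI_API_KEY from the environment

@app.post("/openai/webhook")
def openai_webhook():
    event = request.get_json()
    if event.get("type") == "response.completed":
        # The webhook body only carries the response ID, so fetch the full
        # response to get the metadata back (an extra API call per event).
        response = client.responses.retrieve(event["data"]["id"])
        chat_id = (response.metadata or {}).get("chatId")
        # ...look up my own records with chat_id here...
    return "", 200
```

If the metadata were included in the webhook payload directly, this extra retrieve call could be dropped.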
Thanks!