In the documentation for the Responses API you mention that you can pass through OAuth tokens. However, most remote MCPs I’ve found only support the OAuth flow where the user gets redirected and they need to manually enter their credentials at the start. How does this work in the Responses API? Is there a way to get redirected and have the Responses API automatically handle the OAuth flow?
Welcome to the dev forum @jloeshelle
The Responses API doesn’t manage authentication for you – instead, it lets you attach arbitrary HTTP headers to the MCP tool definition. These headers are forwarded with every call to the remote MCP server and can contain an API key or an OAuth bearer token. The header values are not stored by OpenAI and do not appear in the response; you must send the full server_url and the headers with every API call.
E.g., OpenAI’s docs show how to call the Stripe MCP server by setting the Authorization header to Bearer $STRIPE_API_KEY or a user‑scoped OAuth token.
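As a minimal sketch of that pattern, here is how the MCP tool definition with an `Authorization` header can be constructed before passing it to `client.responses.create` (the Stripe label and URL follow the docs example; the token value is a placeholder you supply from your own environment):

```python
import os

# Token obtained out-of-band: an API key or a user-scoped OAuth access token.
# Placeholder fallback is for illustration only.
token = os.environ.get("STRIPE_API_KEY", "sk_test_placeholder")

mcp_tool = {
    "type": "mcp",
    "server_label": "stripe",
    "server_url": "https://mcp.stripe.com",
    # Forwarded verbatim with every call to the remote MCP server.
    # OpenAI does not store these values, so resend them on every request.
    "headers": {"Authorization": f"Bearer {token}"},
    "require_approval": "never",
}

# With the official SDK you would then pass it like so:
# from openai import OpenAI
# client = OpenAI()
# resp = client.responses.create(
#     model="gpt-4.1",
#     tools=[mcp_tool],
#     input="Create a payment link for $20",
# )
print(mcp_tool["server_label"])
```

Note that the full `server_url` and `headers` must accompany every `responses.create` call; nothing is persisted between requests.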
If you are using the Responses API today, there is no automatic OAuth redirect flow. You (or your backend) must run the OAuth 2.x authorization code flow with the third‑party service to obtain an access token, then include that token in the Authorization header of your MCP tool definition. In other words, you – the developer – are responsible for getting and refreshing the user’s OAuth token and passing it through as a header. The current Responses API won’t present the end user with a login screen or automatically exchange authorization codes.
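For completeness, the flow you run yourself looks roughly like this. All endpoint URLs, the client ID, and the redirect URI below are placeholders for whatever the third-party service publishes; this is a sketch of the standard authorization-code grant, not OpenAI-specific API:

```python
import json
import secrets
import urllib.parse
import urllib.request

# Placeholders: substitute the values from the third-party service's OAuth docs.
AUTHORIZE_URL = "https://auth.example.com/oauth/authorize"
TOKEN_URL = "https://auth.example.com/oauth/token"
CLIENT_ID = "your-client-id"
REDIRECT_URI = "https://yourapp.example.com/oauth/callback"


def build_authorize_url(state: str, scope: str = "read") -> str:
    """Step 1: redirect the user's browser here so they can log in."""
    params = {
        "response_type": "code",
        "client_id": CLIENT_ID,
        "redirect_uri": REDIRECT_URI,
        "scope": scope,
        "state": state,  # CSRF protection: verify it on the callback
    }
    return AUTHORIZE_URL + "?" + urllib.parse.urlencode(params)


def exchange_code_for_token(code: str, client_secret: str) -> dict:
    """Step 2: on your callback endpoint, trade the code for tokens."""
    data = urllib.parse.urlencode({
        "grant_type": "authorization_code",
        "code": code,
        "redirect_uri": REDIRECT_URI,
        "client_id": CLIENT_ID,
        "client_secret": client_secret,
    }).encode()
    with urllib.request.urlopen(urllib.request.Request(TOKEN_URL, data=data)) as resp:
        # Response includes access_token and, usually, refresh_token / expires_in.
        return json.loads(resp.read())


state = secrets.token_urlsafe(16)
login_url = build_authorize_url(state)
```

The `access_token` returned in step 2 is what you place in the `Authorization: Bearer …` header of your MCP tool definition, and your backend is responsible for refreshing it when it expires.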
By contrast, when you connect a custom remote MCP server in ChatGPT via the “Custom connectors” feature, ChatGPT exposes an OAuth flow to the end users in your workspace. The docs explain that if you connect your own remote MCP server in ChatGPT, “users in your workspace will get an OAuth flow to your application.”
In that scenario, the user is redirected to your OAuth provider, logs in, and ChatGPT stores the resulting token for subsequent calls. This flow is currently only available in the ChatGPT connectors UI, not through the raw Responses API.
So, for now:
- In the Responses API, you must acquire the OAuth access token yourself and supply it via the `headers` field in your MCP tool definition. There is no built-in redirect flow, so the end user cannot authenticate directly through the API.
- In ChatGPT (using custom connectors), end users can be presented with an OAuth login for your MCP server.
Here’s an example from the OpenAI Cookbook that shows how to connect to Databricks MCP servers.