It would be great if the Remote MCP feature in the Responses API called the MCP server from the client instead of the server, to access internal MCP servers.
My company hosts MCP servers inside of a private VPC, so we can’t use this feature as OpenAI’s servers can’t reach them.
As an alternative, you could define a custom function in the tools parameter that lets the API request context, with your client acting as a proxy to your internal infrastructure.
Since the remote API can't access your MCP server, you'd have to respond over multiple turns back and forth.
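Here's a minimal sketch of that proxy pattern, assuming the Python `openai` SDK and the Responses API function-calling shape. The names `internal_search` and `query_internal_mcp` are hypothetical placeholders for your own tool and VPC-side lookup:

```python
import json

# Custom function tool passed in the Responses API `tools` parameter.
# The model calls this instead of a Remote MCP server it can't reach.
INTERNAL_SEARCH_TOOL = {
    "type": "function",
    "name": "internal_search",  # hypothetical tool name
    "description": "Search the company's internal knowledge base.",
    "parameters": {
        "type": "object",
        "properties": {"query": {"type": "string"}},
        "required": ["query"],
    },
}

def query_internal_mcp(query: str) -> str:
    """Placeholder: call your MCP server inside the VPC and return results."""
    return json.dumps({"results": f"internal results for {query!r}"})

def run_with_proxy(client, model: str, user_input: str) -> str:
    """Multi-turn loop: relay the model's tool calls to the internal server."""
    input_items = [{"role": "user", "content": user_input}]
    while True:
        resp = client.responses.create(
            model=model, input=input_items, tools=[INTERNAL_SEARCH_TOOL]
        )
        calls = [item for item in resp.output if item.type == "function_call"]
        if not calls:
            return resp.output_text  # model answered without the tool
        input_items += resp.output  # keep the call items in the transcript
        for call in calls:
            args = json.loads(call.arguments)
            input_items.append({
                "type": "function_call_output",
                "call_id": call.call_id,
                "output": query_internal_mcp(args["query"]),
            })
```

Each round trip costs a full request, so it's chattier than server-side Remote MCP, but the MCP traffic never leaves your network.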
Just curious, does your company use any major services other than OpenAI that offer the feature you mention?
We also use LangChain in some places, which has an MCP integration on the client-side.