My experience was the same - I can’t get Deep Research to actually use the data I return from the search tool. I can see in my logs that it’s passing data, but ChatGPT acts like I’m speaking gibberish to it… and I don’t know why!
Some more docs or some guidance here from OpenAI would be appreciated. We want this to work!
I’m also very disappointed with the VERY LIMITED implementation. Anyway, I have another, weirder problem: my MCP server works in Assistants (‘search’ and ‘fetch’ are correctly found as tools and work perfectly), but when it’s added to Deep Research it says it doesn’t follow the specifications. How do the two differ?
Getting the red text error message next to the custom connector saying it doesn’t follow the specifications happens sometimes even when the connector does follow the spec. If it does actually follow the spec, it will still work when doing Deep Research though!
Actually, I can confirm this: if you go into Edit, everything looks fine, and after reloading the page the connector then appears in Deep Research. An incredibly fragile and limited MCP implementation is what they built.
@sobannon in your working example you talk about implementing a custom OAuth flow for various reasons… can you shed any more light on it? I’m trying to implement a simple OAuth flow with Clerk. It works perfectly on Claude and locally on Cursor, but after ChatGPT is called back from Clerk following authorisation, it simply stalls for a few seconds on a blank page and then says there was an error. On my side I can see that it never calls back to my endpoints after reaching https://chatgpt.com/connector_platform_oauth_redirect
What were the reasons you needed to implement a custom OAuth flow?
This seems to be based on the example code, and I think it’s incorrect. According to the schema OpenAI publishes, there are multiple required return fields like id, title, text…
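To make that concrete, here is a hedged sketch of what a search result payload with those required fields might look like, along with the MCP content-array wrapping for the tools/call response. The field set (id, title, text, url) follows the schema discussed above; the helper name and all values are my own illustration, not anything from OpenAI’s code.

```python
import json

def make_search_result(doc_id: str, title: str, snippet: str, url: str) -> dict:
    """Build one search result with the fields the schema reportedly requires."""
    return {
        "id": doc_id,     # stable identifier, later passed back to `fetch`
        "title": title,   # human-readable title
        "text": snippet,  # relevant text snippet for the model to read
        "url": url,       # source link used for citations
    }

results = {"results": [
    make_search_result("doc-1", "Example note",
                       "Matching snippet...", "https://example.com/doc-1"),
]}

# An MCP tools/call result carries a content array; the JSON payload is
# serialized into a single text item rather than returned as raw JSON.
tool_response = {
    "content": [{"type": "text", "text": json.dumps(results)}]
}
print(json.dumps(tool_response, indent=2))
```

The point worth double-checking in your own server is that last wrapping step: the results object is a JSON string inside a text content item, not a bare object.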
I’m having this exact experience. I’ve finally got it to accept my response to “tools/list” (by parroting nearly exactly THEIR tool definitions from the documentation) but then when they actually send me a “tools/call” and I respond with JSON that exactly follows the schema, their deep research agent acts as if it can’t understand what I sent. And their MCP client closes the connection abruptly. I think it must have to do with something about the format or structure of the JSON I’m sending back. Any progress on this issue?
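For anyone comparing notes, here is roughly the shape of a tools/list response modeled on the search/fetch definitions in OpenAI’s documentation, as described above. The descriptions and schema details here are illustrative assumptions, not the authoritative definitions.

```python
import json

# Hedged sketch of a JSON-RPC tools/list response exposing the two tools
# Deep Research expects. Descriptions and schemas are illustrative.
tools_list_response = {
    "jsonrpc": "2.0",
    "id": 2,  # must echo the id from the incoming request
    "result": {
        "tools": [
            {
                "name": "search",
                "description": "Search the document store and return matching results.",
                "inputSchema": {
                    "type": "object",
                    "properties": {"query": {"type": "string"}},
                    "required": ["query"],
                },
            },
            {
                "name": "fetch",
                "description": "Fetch the full contents of a document by id.",
                "inputSchema": {
                    "type": "object",
                    "properties": {"id": {"type": "string"}},
                    "required": ["id"],
                },
            },
        ]
    },
}
print(json.dumps(tools_list_response, indent=2))
```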
So, using @sobannon’s example, I was actually finally able to get ChatGPT to accept the results I was sending back. Anyone still having trouble, I’d suggest looking at that implementation of the search and fetch tools for guidance. Thanks @sobannon!
@OpenAI_Support are there plans to support custom connectors beyond the search and fetch verbs? I would like it to run any MCP tool that the custom connector exposes. Also, will it come to Plus users?
Disappointed by the lack of guidance on how exactly search and fetch work, along with the abrupt end to the calls without much detail. Btw, fixating on search and fetch tools defeats the whole point of building an MCP server.
I have now spent hundreds of hours trying to get a FastAPI MCP server to work with custom connectors. In my MCP server logs I see all the JSON responses with the correct input and output schemas - initialize, tools/list, and tools/call all look schema-correct - yet I am still unable to get it working. I have seen @sobannon’s example, but honestly, not being a developer myself, I don’t quite understand the whole project, and there doesn’t seem to be much good documentation about this.

Essentially I wanted to use my local MCP server to search and fetch documents from my local RAG, into which I ingested research notes. The funny thing is that at one point I did have it working with custom connectors, and Deep Research was able to search the notes in my RAG and deliver a report - but then something broke, I deleted the connector, and now I’m unable to add it back.

If anyone has been able to get a FastAPI MCP remote server working with streamable-http instead of SSE, that would be great. I have reached out to OpenAI support; of course they want logs, .har files, etc., which I will provide, but I figured I would also reach out to the community. The summary below was produced by Claude Sonnet 4.0 and confirmed by ChatGPT 4.1.
OpenAI MCP Compatibility
Your current approach is more explicit and robust than the FastMCP minimal example.
Your initialize response is fully compliant (capabilities, serverInfo, protocolVersion).
Your tools/list response is best practice (full schemas, not just IDs).
Your tools/call for search should return full objects, not just IDs.
Streaming HTTP is preferred over SSE for new deployments.
Is this available for Plus users? Following this thread and the docs, it isn’t clear to me. I’m also not seeing the custom or create options under Connectors.
EDIT: As of the doc update a couple of days ago, custom connectors still don’t appear to be enabled for Plus users. Interesting decision not to support custom connectors for Plus users when Claude gives custom MCP connectors even to free-tier users.
Custom connectors are only available for Teams accounts, and ONLY for team administrators. They are not available for Plus/Pro. If you accept a team invite, you can’t manage custom connectors either - you have to be the team admin.
I believe I read specifically this was removed for Plus users, so I wouldn’t expect it to be added again for a while at least.
My biggest problem is the lack of support for arbitrary tool calls. MCP specifies how to list tools, and ChatGPT calls tools/list, but it only supports search and fetch… why?! I’m curious whether OpenAI will support arbitrary tools in the future.
Super disappointed with the limited MCP support while Claude continues to ship constant updates for custom connectors. Is this going to be available beyond search/fetch within the next month, or should I plan to move my team over to Claude?
For everyone else: you are better off creating a simple n8n AI agent and connecting it via Actions in a custom GPT.