Cannot connect to my custom MCP Server

Trying out the new MCP support. I have a custom MCP server up and running and have tested it with npx @modelcontextprotocol/inspector, where it works perfectly. It does not use any authentication. When I try to set up the connection in the playground, it seems to take a long time to connect to the MCP server (if it really is connecting, I'm not sure), then goes to the page that lists the tools and errors out with "Unable to load tools".

I have switched the MCP server transport type from SSE to streamable HTTP without any change in behavior.

Anybody have any ideas what I may be doing wrong?

Hey there and welcome to the forum!

Do you have any screenshots of the output produced on either end when attempting this? Any way we could see some of the code for the MCP build or the shell scripts used to run the server?

There could be a plethora of issues that may be causing this. My intuition tells me this might be more of a networking config problem than an error in the server’s code itself, but I can’t verify that without seeing more details.

Have you verified you can ping/reach the MCP server from outside your local network first?
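A quick way to do that check from a machine outside your network is a plain HTTP GET against the endpoint. This is just a reachability sketch, not the MCP handshake itself, and the URL below is a placeholder for your own server address:

```python
# Reachability sketch: any HTTP response (even 4xx/5xx) proves the server
# is reachable; a connection/DNS error points at a networking problem.
# "http://your-droplet-ip:8000/mcp" is a placeholder, not a real endpoint.
import urllib.error
import urllib.request

def check_endpoint(url: str, timeout: float = 5.0) -> str:
    """Return a short status string for a plain GET against url."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return f"reachable (HTTP {resp.status})"
    except urllib.error.HTTPError as e:
        # The server answered with an error status, so it IS reachable.
        return f"reachable (HTTP {e.code})"
    except (urllib.error.URLError, OSError) as e:
        return f"unreachable: {e}"

print(check_endpoint("http://your-droplet-ip:8000/mcp"))
```

If this reports unreachable from outside the network but works locally, the problem is firewall/routing config rather than the server code.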

2 Likes

Here is a video showing it working with MCP inspector and then failing in the playground.

Here is the code of the MCP server:

from fastmcp import FastMCP
mcp = FastMCP(name="hello-test-server", require_session=False)

@mcp.tool()
def hello_world() -> str:
    """Return the text 'hello world!'."""
    return "hello world!"

mcp.run(transport="streamable-http", host="0.0.0.0", port=8000, path="/mcp")

This is served from a small DigitalOcean droplet running Ubuntu, and port 8000 is open via ufw.

For some reason it is not letting me post a link to the video, grrr.

Working with mcp inspector

Failing in the playground

Is there anywhere we can check the MCP connection errors? I've set up an MCP server on AWS Fargate and am able to connect when executing a test from my client (locally), but when I attempt to add this MCP server within OpenAI tools, I get the same "Check the server URL and verify..." error message.

For example, here is my local test connection which works:
import asyncio
from fastmcp import Client

client = Client("http://xxxxxxx:8000/mcp/")

async def call_tool(name: str):
    async with client:
        result = await client.call_tool("greet", {"name": name})
        print(result)

asyncio.run(call_tool("Ford"))

MCP servers must be secure HTTPS endpoints; plain HTTP is not accepted by OpenAI. The error message is always the same, so it's difficult to figure out the cause.

Hence, I used AWS App Runner, which provides an HTTPS endpoint by default, rather than setting up individual AWS services such as ECS, a load balancer, etc.
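Since the error message is identical regardless of cause, a cheap pre-flight check before registering a server is to confirm the URL actually uses the https scheme. A minimal sketch (the example URLs are placeholders):

```python
# Pre-flight check: plain-HTTP MCP URLs are rejected, so verify the scheme
# before pasting the URL into the tool config.
from urllib.parse import urlparse

def is_acceptable_mcp_url(url: str) -> bool:
    """True only for well-formed https:// URLs."""
    parts = urlparse(url)
    return parts.scheme == "https" and bool(parts.netloc)

assert is_acceptable_mcp_url("https://example.com/mcp")
assert not is_acceptable_mcp_url("http://example.com/mcp")  # plain HTTP: rejected
assert not is_acceptable_mcp_url("example.com/mcp")         # missing scheme
```

This obviously doesn't validate the certificate or the MCP handshake, but it catches the most common mistake described in this thread.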

1 Like

The requirement for HTTPS really should be called out, or the protocol prefix pre-populated so users know; I just wasted hours on this before finding this post.

2 Likes

For those looking for a quick way to test local projects without having to manually set up TLS certificates or DNS entries, check out cloudflared, which provides one-line setup in seconds.

I would post the link, but apparently I can't because that would be too useful. Silly forum rules, so you'll have to search for it.

In case it helps: I got frustrated with modelcontextprotocol/inspector as well, and since I couldn't find a simple site where I could test a remote MCP server, I developed my own. For now I've put it at MCP Server Testing - mcp.implement.ai, in case you want to use it or test it out.

PS: It does work with various SSE and HTTP servers, but I don't think I have it working with OAuth yet. So it's best not to use authorization for testing until you deploy.