OpenAI().responses.create() failure for https MCP tool behind nginx

This issue is similar to several earlier topics here, but none of them resolved it.

I have an MCP server implemented with fastmcp, served over HTTPS via nginx. I can access the tool list directly via fastmcp from any machine on the web with tools = await client.session.list_tools() and a URL of the form https://xxx:8443/mcp.

Here is how I start the fastmcp server:

```
mcp.run(transport=transport, host='0.0.0.0', port=8080, stateless_http=True)
```

If I don’t use stateless_http it also doesn’t work.

When I pass the tool to openai.OpenAI().responses.create() I get back the following (I have replaced my actual machine name with xxx in all the output):

```
Traceback (most recent call last):
  File "/Users/barrysmith/Src/petsc_mcp_servers/petsc_code_generator_mcp_server.py", line 33, in <module>
    asyncio.run(main())
    ~~~~~~~~~~~^^^^^^^^
  File "/usr/local/Cellar/python@3.13/3.13.11_1/Frameworks/Python.framework/Versions/3.13/lib/python3.13/asyncio/runners.py", line 195, in run
    return runner.run(main)
           ~~~~~~~~~~^^^^^^
  File "/usr/local/Cellar/python@3.13/3.13.11_1/Frameworks/Python.framework/Versions/3.13/lib/python3.13/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^
  File "/usr/local/Cellar/python@3.13/3.13.11_1/Frameworks/Python.framework/Versions/3.13/lib/python3.13/asyncio/base_events.py", line 725, in run_until_complete
    return future.result()
           ~~~~~~~~~~~~~^^
  File "/Users/barrysmith/Src/petsc_mcp_servers/petsc_code_generator_mcp_server.py", line 27, in main
    response = oai_client.responses.create(model = "gpt-5-mini", tools = tools, input="What does the PETSc function KSPSolve do?")
  File "/usr/local/lib/python3.13/site-packages/openai/resources/responses/responses.py", line 866, in create
    return self._post(
           ~~~~~~~~~~^
        "/responses",
        ^^^^^^^^^^^^^
    ...<40 lines>...
        stream_cls=Stream[ResponseStreamEvent],
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "/usr/local/lib/python3.13/site-packages/openai/_base_client.py", line 1294, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
                           ~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.13/site-packages/openai/_base_client.py", line 1067, in request
    raise self._make_status_error_from_response(err.response) from None
openai.APIStatusError: Error code: 424 - {'error': {'message': "Error retrieving tool list from MCP server: 'PETSc_Compile_Run_MCP_Server'. Http status code: 424 (Failed Dependency)", 'type': 'external_connector_error', 'param': 'tools', 'code': 'http_error'}}
```
where my tool is provided as
```
tools=[{"type": "mcp", "server_label": "PETSc_Compile_Run_MCP_Server", "server_url": "https://xxx:8443/mcp", "require_approval": "never"}]
```
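For completeness, here is a minimal sketch of how the tool entry is built and passed; the make_mcp_tool helper is hypothetical (my script just inlines the dict), and the field values are the ones shown above:

```python
# Hypothetical helper; the field names are the ones from the tools list above.
def make_mcp_tool(label: str, url: str) -> dict:
    """Build one MCP tool entry for OpenAI().responses.create(tools=[...])."""
    return {
        "type": "mcp",
        "server_label": label,        # label echoed back in error messages
        "server_url": url,            # must be reachable from OpenAI's servers
        "require_approval": "never",  # skip per-call approval prompts
    }

tools = [make_mcp_tool("PETSc_Compile_Run_MCP_Server", "https://xxx:8443/mcp")]

# The failing call from the traceback, for reference:
# response = oai_client.responses.create(
#     model="gpt-5-mini", tools=tools,
#     input="What does the PETSc function KSPSolve do?")
```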

I checked the logs of the fastmcp server, and it seems it never receives the tool-list request from OpenAI. Here is the response when using curl to access the URL, so HTTPS and the SSL certificates appear to be fine:

```

$ curl -Iv https://xxx:8443/mcp
* Host xxx:8443 was resolved.
* IPv6: (none)
* IPv4: 67.184.144.11
*   Trying 67.184.144.11:8443...
* Connected to xxx (67.184.144.11) port 8443
* ALPN: curl offers h2,http/1.1
* (304) (OUT), TLS handshake, Client hello (1):
*  CAfile: /etc/ssl/cert.pem
*  CApath: none
* (304) (IN), TLS handshake, Server hello (2):
* (304) (IN), TLS handshake, Unknown (8):
* (304) (IN), TLS handshake, Certificate (11):
* (304) (IN), TLS handshake, CERT verify (15):
* (304) (IN), TLS handshake, Finished (20):
* (304) (OUT), TLS handshake, Finished (20):
* SSL connection using TLSv1.3 / AEAD-AES256-GCM-SHA384 / [blank] / UNDEF
* ALPN: server accepted h2
* Server certificate:
*  subject: CN=xxx
*  start date: Jan 30 15:49:46 2026 GMT
*  expire date: Apr 30 15:49:45 2026 GMT
*  subjectAltName: host "xxx" matched cert's "xxx"
*  issuer: C=US; O=Let's Encrypt; CN=E7
*  SSL certificate verify ok.
* using HTTP/2
* [HTTP/2] [1] OPENED stream for https://xxx:8443/mcp
* [HTTP/2] [1] [:method: HEAD]
* [HTTP/2] [1] [:scheme: https]
* [HTTP/2] [1] [:authority: xxx:8443]
* [HTTP/2] [1] [:path: /mcp]
* [HTTP/2] [1] [user-agent: curl/8.7.1]
* [HTTP/2] [1] [accept: */*]
> HEAD /mcp HTTP/2
> Host: xxx:8443
> User-Agent: curl/8.7.1
> Accept: */*
> 
* Request completely sent off
< HTTP/2 405 
HTTP/2 405 
< server: nginx/1.28.1
server: nginx/1.28.1
< date: Tue, 03 Feb 2026 17:28:18 GMT
date: Tue, 03 Feb 2026 17:28:18 GMT
< content-type: application/json
content-type: application/json
< content-length: 92
content-length: 92
< allow: GET, POST, DELETE
allow: GET, POST, DELETE
< 

* Connection #0 to host xxx left intact
```
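Note the 405 above is expected for a HEAD request, since the endpoint only allows GET, POST, and DELETE; OpenAI's connector instead POSTs JSON-RPC messages to the endpoint. A closer simulation of what OpenAI sends is to POST an initialize request. The payload shape below is my reading of the MCP Streamable HTTP transport, so treat the exact version string and clientInfo values as assumptions:

```python
import json

def initialize_payload() -> str:
    """Hypothetical MCP initialize request (JSON-RPC 2.0); values are placeholders."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": 1,
        "method": "initialize",
        "params": {
            "protocolVersion": "2025-03-26",
            "capabilities": {},
            "clientInfo": {"name": "curl-test", "version": "0"},
        },
    })

# Rough curl equivalent (the Accept header matters for Streamable HTTP):
# curl -v https://xxx:8443/mcp \
#   -H 'Content-Type: application/json' \
#   -H 'Accept: application/json, text/event-stream' \
#   -d '<output of initialize_payload()>'
print(initialize_payload())
```

If this POST never shows up in the fastmcp logs either, the problem is in front of the server (nginx or firewall) rather than in the OpenAI tool spec.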

Any suggestions for debugging this would be appreciated. For example, if I could see exactly how OpenAI is trying to contact the MCP server, that might help.