GPT Action Works Locally but Fails on Remote Server

I was able to run my own Plugin server and connect it to a GPT Action locally (using localtunnel). However, when I host the exact same server code remotely, the GPT Action does not appear to contact the server at all.

Here is my openapi.yaml:

openapi: "3.0.0"
info:
  version: 1.0.0
  title: Example Plugin
  license:
    name: MIT
servers:
  - url: https://sub-domain.mydomain.com
paths:
  /search/{query}:
    get:
      summary: Search information from database
      operationId: databaseSearch
      parameters:
        - name: query
          in: path
          description: The user's current question.
          required: true
          schema:
            type: string
      responses:
        "200":
          description: OK
          content:
            application/json:
              schema:
                $ref: "#/components/schemas/getQueryResponse"
        default:
          description: unexpected error
          content:
            application/json:
              schema:
                $ref: "#/components/schemas/Error"
components:
  schemas:
    getQueryResponse:
      type: object
      properties:
        queryResponse:
          type: array
          items:
            type: string
          description: The retrieved information pieces from the database.
    Error:
      type: object
      required:
        - code
        - message
      properties:
        code:
          type: integer
          format: int32
        message:
          type: string
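
For context, the endpoint itself is nothing special. Here is a minimal sketch of its shape (FastAPI and the placeholder lookup are just for illustration, not my actual server code):

from typing import List

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class GetQueryResponse(BaseModel):
    # Mirrors the getQueryResponse schema in the spec above.
    queryResponse: List[str]

@app.get(
    "/search/{query}",
    operation_id="databaseSearch",
    response_model=GetQueryResponse,
)
def database_search(query: str) -> GetQueryResponse:
    # Placeholder: the real server looks the query up in a database.
    return GetQueryResponse(queryResponse=[f"results for: {query}"])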

The only difference between running this locally and remotely is the URL I provide under servers.

My hosting stack:
I’m hosting the server on AWS, and this is the server architecture: Subdomain (CNAME Record) → AWS Application Load Balancer → Target Group → EC2 instance running the server.

What I’ve checked:

  • My domain has a valid SSL certificate (covering subdomains as well) and the load balancer is accepting connections on port 443.
  • I can access all of the endpoints from the browser and they work (no auth).
  • My robots.txt is:
User-agent: ChatGPT-User
Disallow:
  • I do not see any attempts from OpenAI to connect to my load balancer in the logs.
  • When I ask the GPT what the error was, it tells me it is a ClientError, and the response I receive from the debug is this:
[Debug] Response Received
{}
Error talking to
  • My CORS policy allows the appropriate https://chat.openai.com origin.

Given that I don’t see any requests from OpenAI to my load balancer, I believe I’m doing something incorrectly on the networking side of things, but I have tried everything I can think of. Please let me know if I’m missing something here!

4 Likes

I also can’t get my GPT to access an external API. It’s so frustrating since I can’t even find any logs on the issue!

4 Likes

I found the issue! My load balancer on AWS was not set up with a certificate chain (which is optional when you import a certificate into AWS Certificate Manager). As a result, the code GPT uses to access my API was rejecting the responses. Browsers retrieve missing intermediate certificates themselves if the server doesn’t provide them, which is why I could access the API from the browser.
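
If anyone else hits this, a quick way to check whether your server presents the full chain is to attempt a strictly verified TLS handshake yourself, for example with Python's ssl module (the hostname below is a placeholder):

import socket
import ssl

HOST = "sub-domain.mydomain.com"  # placeholder: use your own subdomain

# Strict verification against the system trust store, like the client that
# calls your Action. Browsers fetch missing intermediate certificates on
# their own, which is why the API still worked from a browser.
context = ssl.create_default_context()

try:
    with socket.create_connection((HOST, 443), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=HOST) as tls:
            print("Chain verified OK:", tls.getpeercert()["subject"])
except ssl.SSLCertVerificationError as err:
    # An incomplete chain typically shows up here as
    # "unable to get local issuer certificate".
    print("Verification failed:", err)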

Same issue here with a fresh new domain from Namecheap; it cannot verify.