Need some help debugging my GPT's integration with my OpenAPI action

TL;DR: I’m trying to build a custom GPT and integrate it with my Flask server hosted on Replit as the backend for an action. The GPT keeps giving me errors, and looking at my Flask server, it seems the incoming requests aren’t honoring my OpenAPI spec.

My GPTs: ChatGPT - Tic Tac Toe Game with Direct API Play
My endpoint: APIFlaskEndpoint - Replit

So, I tried very hard to integrate my API endpoint with my GPT. Right now the GPT is calling my endpoint, but it’s not getting anything useful back. There’s a weird error: when I ask the GPT, it says the error is Method Not Allowed (405). Basically, my OpenAPI config says the operation requires a POST request, but my server is receiving a GET request, and I believe that’s why it’s failing. BTW, I don’t have any auth configured, so it shouldn’t be auth-related. I curled my endpoint directly and it works fine.

The OpenAI request as intercepted at my Flask app:

<Request 'http://apiflaskendpoint--yjianghong.repl.co/human_move' [GET]>
Headers:
 Host: apiflaskendpoint--yjianghong.repl.co
User-Agent: Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko); compatible; ChatGPT-User/1.0; +https://openai.com/bot
Accept: */*
Accept-Encoding: gzip, deflate
Content-Type: application/json
Openai-Conversation-Id: 1138d991-a8be-5b9b-8ce9-24425ed88ce0
Openai-Ephemeral-User-Id: 771bad20-2f10-50cc-8785-18e8e26261ac
Openai-Gpt-Id: g-gLuNmwdjS
Openai-Subdivision-1-Iso-Code: US-NJ
Traceparent: 00-00000000000000002f205fed3920fb6d-5c0198402ac8a203-00
Tracestate: dd=s:0
X-Datadog-Parent-Id: 6629747527829201411
X-Datadog-Sampling-Priority: 0
X-Datadog-Trace-Id: 3395819591507704685
X-Forwarded-For: 13.66.11.98
X-Forwarded-Proto: https
X-Replit-User-Bio: 
X-Replit-User-Id: 
X-Replit-User-Name: 
X-Replit-User-Profile-Image: 
X-Replit-User-Roles: 
X-Replit-User-Teams: 
X-Replit-User-Url: 


Body:
 b''
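For what it’s worth, here’s a minimal Flask sketch (assuming the `/human_move` route name from the spec) that logs every incoming method and body, and returns an explicit error on GET instead of Flask’s default 405 page. It makes this kind of method mismatch much easier to spot in the console:

```python
from flask import Flask, request, jsonify

app = Flask(__name__)

# Accept both methods so we can observe what ChatGPT actually sends,
# then reject anything that isn't the POST the spec declares.
@app.route("/human_move", methods=["GET", "POST"])
def human_move():
    print(f"{request.method} {request.path} body={request.get_data()!r}")
    if request.method != "POST":
        return jsonify(error=f"expected POST, got {request.method}"), 405
    data = request.get_json(silent=True) or {}
    # Echo the board back; the real AI-move logic would go here
    return jsonify(board=data.get("board"))
```

This is just a diagnostic sketch, not your actual game logic, but it confirms on the server side whether the spec or the caller is at fault.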

My OpenAPI config:

{
  "openapi": "3.0.3",
  "info": {
    "title": "APIFlask",
    "version": "0.1.0"
  },
  "servers": [
    {
      "url": "https://apiflaskendpoint--yjianghong.repl.co/"
    }
  ],
  "paths": {
    "/human_move": {
      "post": {
        "summary": "This API endpoint is responsible for responding a tic tac toe game. The input is the board after human player played, and the output is the board after AI played",
        "operationId": "Provide",
        "requestBody": {
          "content": {
            "application/json": {
              "schema": {
                "type": "object",
                "required": [
                  "board"
                ],
                "properties": {
                  "board": {
                    "type": "array",
                    "minItems": 3,
                    "maxItems": 3,
                    "items": {
                      "type": "array",
                      "minItems": 3,
                      "maxItems": 3,
                      "items": {
                        "type": "string",
                        "minLength": 1,
                        "maxLength": 1
                      }
                    }
                  }
                }
              }
            }
          }
        },
        "responses": {
          "200": {
            "description": "Successful response",
            "content": {
              "application/json": {
                "schema": {
                  "type": "object",
                  "required": [
                    "board"
                  ],
                  "properties": {
                    "board": {
                      "type": "array",
                      "minItems": 3,
                      "maxItems": 3,
                      "items": {
                        "type": "array",
                        "minItems": 3,
                        "maxItems": 3,
                        "items": {
                          "type": "string",
                          "minLength": 1,
                          "maxLength": 1
                        }
                      }
                    }
                  }
                }
              }
            }
          },
          "422": {
            "description": "Validation error",
            "content": {
              "application/json": {
                "schema": {
                  "type": "object",
                  "properties": {
                    "message": {
                      "type": "string"
                    },
                    "detail": {
                      "type": "object",
                      "properties": {
                        "<location>": {
                          "type": "object",
                          "properties": {
                            "<field_name>": {
                              "type": "array",
                              "items": {
                                "type": "string"
                              }
                            }
                          }
                        }
                      }
                    }
                  }
                }
              }
            }
          }
        }
      }
    }
  },
  "components": {
    "schemas": {
      "Board": {
        "type": "object",
        "required": [
          "board"
        ],
        "properties": {
          "board": {
            "type": "array",
            "minItems": 3,
            "maxItems": 3,
            "items": {
              "type": "array",
              "minItems": 3,
              "maxItems": 3,
              "items": {
                "type": "string",
                "minLength": 1,
                "maxLength": 1
              }
            }
          }
        }
      },
      "ValidationError": {
        "type": "object",
        "properties": {
          "message": {
            "type": "string"
          },
          "detail": {
            "type": "object",
            "properties": {
              "<location>": {
                "type": "object",
                "properties": {
                  "<field_name>": {
                    "type": "array",
                    "items": {
                      "type": "string"
                    }
                  }
                }
              }
            }
          }
        }
      }
    }
  }
}
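One way to rule out the spec itself: load it and print which methods each path actually declares. A trimmed-down sketch (only the fields the check needs, with the path and server URL copied from the spec above):

```python
import json

# Trimmed version of the spec above, just enough for the method check
spec = json.loads("""
{
  "openapi": "3.0.3",
  "servers": [{"url": "https://apiflaskendpoint--yjianghong.repl.co/"}],
  "paths": {"/human_move": {"post": {"operationId": "Provide"}}}
}
""")

for path, operations in spec["paths"].items():
    methods = sorted(operations.keys())
    print(path, methods)
    # The spec declares only POST, so any GET must come from the caller
    assert "get" not in methods
```

Since the spec only defines `post` on `/human_move`, the GET requests in your logs are being generated on the caller’s side, not by anything in your config.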

Please advise how to debug this issue, thanks!


Bookmarking this for when I wake up tomorrow to see if I can help diagnose this further or bring more attention to it. I’m about to test out my own API calls myself, and your errors terrify me because I just have this feeling I’m going to deal with similar problems.

Glad to know I’m not the only one scrambling my butt off attempting to build my own custom APIs.

I can’t remember - is Flask containerized by default? Have you verified all the IPs are correct, that you’re calling the right IPs, and that your network and firewall settings are configured properly? I don’t know your skill level; I’m only asking because a good chunk of my issues throughout my tinkering life have been network config problems.

Also, I don’t know about you, but lately my custom GPT hasn’t been great at saving my changes to the OpenAPI spec action. Check to make sure the OpenAPI spec was saved properly as an action. I have no clue why it’s acting this way; I’m getting random errors just from prompting the GPT builder alone, and it’s been unresolvable and unidentifiable so far.

This is a very new tool and product. Debugging our stuff like this is gonna be difficult, and it may not even be our fault. I’ll look into this more, because we need more examples of this stage of problems/diagnosis/debugging when using GPT builder, but be warned; I don’t know how long it’s going to take.

In the meantime, the closest similarity to have some semblance of examples comes from plugin docs. Check there to see if you’re missing anything.


I see what you’re thinking. I know little about networking, but my educated guess is that OpenAI is able to reach my server, since requests are coming in to my Flask server; they’re just GET instead of POST.

Same here, it’s really annoying since the result is not coherent. At this point I feel the best approach is, after you change the config tab, to ask the GPT builder something simple like “I’ve updated the config” and see if that triggers an update on the builder side.


GPTs use the GET method even when the OpenAPI specification says POST. I think there are some bugs in GPTs actions?


@yjianghong @marco.gramuglia This very well could be. Which, if it is, is consequential for all of us developing our API specs, and this would need to be resolved before the GPT store is released so we all have equal opportunity to succeed.

My only guess at the moment is from what’s in here: https://platform.openai.com/docs/actions

  • If the [x-openai-isConsequential] field isn’t present, we default all GET operations to false and all other operations to true

I noticed this field doesn’t appear in your OpenAPI config (that I can see), and the docs specify the statement above as the default behavior.

Might this have anything to do with it? I really don’t know yet, but this is honest-to-god my best guess based on the information currently available.

Try setting the consequential flag to false in your POST request to see if it changes anything. If nothing changes, then we should report this as a potential bug.
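For reference, the flag goes on the operation object itself. Something like this; only the `x-openai-isConsequential` line is new, the rest is from the spec above (summary elided):

```json
"paths": {
  "/human_move": {
    "post": {
      "x-openai-isConsequential": false,
      "summary": "...",
      "operationId": "Provide"
    }
  }
}
```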


Thanks everyone for helping out. At least today, OpenAI is having some weird bug: after I add my custom action, it breaks. It breaks both the GPT builder and the GPT itself. I tried creating a new GPT; it didn’t help.

My biggest takeaway, though, is to always bump the operationId by 1 after changing your OpenAPI spec; it’s the only way I can keep track of whether my spec was saved or not.

A couple of complaints about the UI: right now it’s very unintuitive; there’s no way to tell if you’ve saved your config, or what happens if both the GPT builder and you are changing the config, etc. It’s really messed up. Plus there’s no version control, so it’s really hard to tell which version is live.

Overall I would not recommend building your actions on GPTs as of today; I think it’ll get much better once we have the Assistants API. Right now there are so many obvious flaws that I just want to SSH onto an OpenAI server and fix them myself.

Hi friend, I created a GPT that can generate both code and API schemas, if you want to check your code with it and see what’s wrong:

API Alchemist URL

have a great day!

Right now there are so many obvious flaws that I just want to SSH onto an OpenAI server and fix them myself.

You and me both, gurl.

I have the instinctual need to fix things, and I’m noticing the same problems you are right now. In fact, I thought it was a network issue, but considering I did the same thing on my own GPT and the results were the same (it crashed both the custom GPT and the builder), I was at a complete loss.

Something is happening with GPT builder beyond what we can debug ourselves, as similar bugs and issues continue to appear across multiple users.

I have never been so terrified, considering that I’m trying to release my own on day one; I can’t release a broken product, nor can I debug my way past the bugs causing my breakage.

I want to use something else, but my entire development stack has centered around this since the DevDay announcement. It literally depends on this and the future ecosystem, so I physically can’t scrap it.

This is my make-or-break moment, and I feel like it could be that way for you and many others.

They need to fix this before they release the store; otherwise the only GPTs that will be producible and likely deployed are going to be chatbot anime waifu clones. I’m going to take a wild guess here that OAI’s intention was to host things a little more…functional. But we can’t have function without API calling, just saying. Otherwise it’s not innovation, it’s rehashing what’s already possible in the base model.

BTW: I’m seeing similar bug reports on r/OpenAI too.

I think this is becoming a thing.


I think I ran into exactly the same issue as you. I set up a development server using Flask and tried to use the POST method; however, in my console log I can see the request coming in as a GET, even though the GPT’s Actions page says the method is POST.
For me, a temporary workaround is to change my server method to GET, and that works. That doesn’t help in all cases, though. I’m curious whether this is a common issue for everyone and, if so, when it will be resolved.
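If anyone wants that workaround in code form, here’s a sketch (assuming Flask and the `/human_move` route): serve both methods, reading the board from the JSON body on POST and from a `board` query parameter on GET. The query-parameter part is an assumption on my side; check your own logs to see where ChatGPT actually puts the arguments on a GET:

```python
import json
from flask import Flask, request, jsonify

app = Flask(__name__)

# Workaround sketch: serve both methods so the action keeps working
# even when the caller sends GET instead of the POST the spec declares.
@app.route("/human_move", methods=["GET", "POST"])
def human_move():
    if request.method == "POST":
        payload = request.get_json(silent=True) or {}
    else:
        # Assumes the caller passes the board as a JSON-encoded query param
        payload = {"board": json.loads(request.args.get("board", "[]"))}
    return jsonify(board=payload.get("board", []))
```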


Yes, I’m seeing the same problem. Things I’ve tried:

  • Calling the API directly using a POST via Postman: works.
  • Calling the API via OpenAI: fails; it actually sends a GET.
  • Sometimes it does send a POST.
  • Changing URLs and duplicating the GPT: exactly the same issue.

I believe only GET works at the moment for some reason. I never saw it come in as POST. It would be very interesting if it sometimes comes in as POST.

I have 3 GPTs that work with POST and 1 that seems to try and use GET. I’m trying to work out if there are any configuration differences between them.

OK, it looks like I’ve fixed my issue with GET instead of POST.

This is output from preliminary testing and it’s unusual.

It seems to be related to the server URL. This does a GET instead of POST:

"servers": [
    {
        "url": "https://xxx.xxxxxxxx.xxx/"
    }
],

This does a POST, as expected.

"servers": [
    {
        "url": "https://xxx.xxxxxxxx.xxx"
    }
],

The only difference is the ‘/’ at the end of the URL.

I’ll keep testing, but removing the ‘/’ at the end of the URL is working for me.
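If that observation holds up, a small guard before pasting the spec into the action editor would prevent regressions. This is a sketch based purely on the behavior reported above (I’m not aware of any official docs confirming the trailing-slash behavior):

```python
def normalize_server_urls(spec: dict) -> dict:
    # Strip the trailing "/" that appears to make the action send GET
    for server in spec.get("servers", []):
        server["url"] = server["url"].rstrip("/")
    return spec

spec = {"servers": [{"url": "https://xxx.xxxxxxxx.xxx/"}]}
print(normalize_server_urls(spec)["servers"][0]["url"])
```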
