How do you pass a user question as a param inside the OpenAPI spec schema

The team at Webdadi have cracked it, and unbelievably easily! Ha. The trick is to have a schema that works. They combined two things: a prompt that converts a cURL request into a schema, and an example of a working GET request schema. Together with an instruction to the GPT, that combination produced a perfectly working result.

Here is a working sample schema, WITHOUT authentication, to test your use case: use it as an example on its own, or merge it with a prompt to generate a schema from a cURL GET request.

You’ll find plenty of help resolving authentication in the Schema using Custom Authorisation in the community here.

The Action sends a GET request to my API using the query-string param input_text. The API is a hosted Python Flask app (running under gunicorn) that exposes a /search endpoint over SSL. The Flask app performs a vectorstore similarity_search against a ChromaDB that is a locally cached VectorDB, and returns the top-scoring answer via jsonify. The schema, the GPT's instructions, and the Action together provide decent inference. Here is my working schema:-

{
    "openapi": "3.1.0",
    "info": {
        "title": "WebdadiAPI",
        "version": "0.1.0"
    },
    "servers": [
        {
            "url": "MyFQDN/"
        }
    ],
    
  "paths": {
    "/search": {
      "get": {
        "summary": "Search GPT-generated content",
        "operationId": "searchGPTContent",
        "parameters": [
          {
            "name": "input_text",
            "in": "query",
            "required": true,
            "schema": {
              "type": "string"
            }
          }
        ],
        "responses": {
          "200": {
            "description": "Search results",
            "content": {
              "application/json": {
                "schema": {
                  "type": "object",
                  "properties": {
                    "results": {
                      "type": "array",
                      "items": {
                        "type": "string"
                      }
                    }
                  }
                }
              }
            }
          }
        }
      }
    }
  },
    "components": {
        "schemas": {
            "HTTPValidationError": {
                "properties": {
                    "detail": {
                        "items": {
                            "$ref": "#/components/schemas/ValidationError"
                        },
                        "type": "array",
                        "title": "Detail"
                    }
                },
                "type": "object",
                "title": "HTTPValidationError"
            },
            "ValidationError": {
                "properties": {
                    "loc": {
                        "items": {
                            "anyOf": [
                                {
                                    "type": "string"
                                },
                                {
                                    "type": "integer"
                                }
                            ]
                        },
                        "type": "array",
                        "title": "Location"
                    },
                    "msg": {
                        "type": "string",
                        "title": "Message"
                    },
                    "type": {
                        "type": "string",
                        "title": "Error Type"
                    }
                },
                "type": "object",
                "required": [
                    "loc",
                    "msg",
                    "type"
                ],
                "title": "ValidationError"
            }
        }
    }
}
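
For reference, the server side described above can be sketched as a minimal Flask app. This is an illustrative sketch, not my production code: top_answer here is a hypothetical stand-in for the ChromaDB similarity_search lookup, so the snippet runs without a vector store on disk. In production the app runs under gunicorn behind SSL.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

def top_answer(query: str) -> str:
    # Hypothetical stand-in for the vectorstore similarity_search;
    # the real service queries a locally cached ChromaDB instead
    return f"Best match for: {query}"

@app.route("/search", methods=["GET"])
def search():
    input_text = request.args.get("input_text")
    if not input_text:
        # Mirror the HTTPValidationError shape declared in the schema's components
        return jsonify({"detail": [{"loc": ["query", "input_text"],
                                    "msg": "Field required",
                                    "type": "missing"}]}), 422
    # Top-scoring answer as an array of strings, matching the schema's
    # "results" property in the 200 response
    return jsonify({"results": [top_answer(input_text)]})
```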

Once you have this schema working and returning responses on a TEST basis, you're away. Then update your GPT instructions to something along the lines of:

Instructions example:

The GPT is designed as a first-line support service agent, specializing in answering customer questions. It utilizes an API via the GPT’s custom action. The user will give you, the GPT, a query. You will take this query and pass it as the “input_text” param in the API. The API /search endpoint will return a response from which you will extract the results key and display it to the user. The GPT’s responses should be friendly and efficient, providing clear summaries of steps for problem resolution. It should engage in conversation, seeking further context when necessary, and answer customer questions thoroughly.
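
The instruction above boils down to one GET with the question in the query string, then pulling the results array out of the JSON body. A sketch of that round trip, assuming a placeholder host in place of the schema's MyFQDN server entry and a hypothetical response body:

```python
import json
from urllib.parse import urlencode

# Placeholder base URL standing in for the schema's "MyFQDN" server entry
base = "https://example.com"
question = "How do I reset my password?"

# The Action URL-encodes the user's question as the input_text query param
url = f"{base}/search?{urlencode({'input_text': question})}"

# Hypothetical response body in the shape the 200 response schema declares
body = json.loads('{"results": ["Go to Settings and choose Reset Password."]}')
answer = body["results"][0]  # what the GPT displays back to the user
```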