GraphQL introspection as an alternative to an OpenAPI spec for plugins

Currently a ChatGPT plugin needs to implement a REST API server with an OpenAPI spec. Every GraphQL API automatically comes with formal, machine-readable documentation (via introspection queries). Considering that there is a rich ecosystem of tools which expose a GraphQL API (for instance, Hasura; disclosure: I work for them, but I'm speaking for myself), I think supporting GraphQL APIs in addition to REST/OpenAPI could be really powerful.

(a similar question was asked here, but it’s not exactly clear what they mean: /t/graphql-support-for-chatgpt-plugins/181748)
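To make the "machine-readable documentation" point concrete, here is a minimal sketch of an introspection request (the endpoint URL is just a placeholder); any spec-compliant GraphQL server answers it with a description of its own schema:

```python
# Minimal introspection sketch: the endpoint URL is a placeholder.
# The response is the machine-readable documentation referred to above.
import requests

INTROSPECTION_QUERY = """
query {
  __schema {
    queryType { name }
    types {
      name
      kind
      description
      fields { name description }
    }
  }
}
"""

response = requests.post(
    "https://example.com/graphql",          # placeholder endpoint
    json={"query": INTROSPECTION_QUERY},
    timeout=30,
)
response.raise_for_status()
schema = response.json()["data"]["__schema"]
print(f"{len(schema['types'])} types exposed by the API")
```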


I assume ChatGPT plugins will eventually support GraphQL, but I would love to know if it's on their near-term roadmap.

Right now I am not really doing "REST" anyway; I am more focused on getting consistent results than on following some orthodoxy.

I think GraphQL support would be a great addition to the plugins.

There are already a lot of tools that use GraphQL, and it is easy to integrate with other apps.

I would like this very much as well. GraphQL seems like it would be much more data-efficient than REST as an endpoint for "Create a GPT" actions.

My workaround so far has been to feed the GraphQL schema as instructions to the GPT API. It works, but official support would be so much easier to work with and would probably perform better in more ways than one :)


robertherber, do you mind sharing your approach in more detail? It sounds interesting.

It's by no means perfect at this stage, but I find it works 90% of the time and could probably be improved further.

I basically specify a single OpenAPI endpoint POSTing to /graphql. Together with this, I'm sending the schema as a system instruction (with a general description of the API's use case and an example or two, just to give it something to work with).
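Roughly, the shape of it is something like this (a minimal sketch; the operation name, descriptions, and the schema.graphql path are illustrative, not my actual spec):

```python
# Illustrative sketch: an OpenAPI document with a single operation that
# forwards an arbitrary GraphQL query to /graphql. The full GraphQL schema
# is supplied separately as a system instruction.
openapi_spec = {
    "openapi": "3.0.1",
    "info": {"title": "GraphQL passthrough", "version": "1.0.0"},
    "paths": {
        "/graphql": {
            "post": {
                "operationId": "runGraphqlQuery",
                "summary": "Execute a GraphQL query against the API",
                "requestBody": {
                    "required": True,
                    "content": {
                        "application/json": {
                            "schema": {
                                "type": "object",
                                "properties": {
                                    "query": {
                                        "type": "string",
                                        "description": "A GraphQL query document",
                                    },
                                    "variables": {"type": "object"},
                                },
                                "required": ["query"],
                            }
                        }
                    },
                },
                "responses": {
                    "200": {"description": "GraphQL response (data and/or errors)"}
                },
            }
        }
    },
}

# The schema itself rides along as a system instruction (placeholder file path).
system_instruction = (
    "You can query our API by calling runGraphqlQuery with a GraphQL query. "
    "The schema is:\n\n" + open("schema.graphql").read()
)
```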

The cool thing is that GPT practically corrects itself (as long as your server doesn't mask its errors) if you feed the errors from the GraphQL API back to it :)
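The feedback loop is something like this (a rough sketch, not my actual code; the model name, file path, and endpoint URL are placeholders):

```python
# Rough sketch of the self-correction loop: run the generated query, and if the
# GraphQL server reports errors, hand them back to the model and ask for a fix.
import requests
from openai import OpenAI

client = OpenAI()

# Stand-in for the schema-as-system-instruction setup described above.
SYSTEM_INSTRUCTION = (
    "You write GraphQL queries for our API. Reply with only the query.\n\n"
    + open("schema.graphql").read()
)

def ask_model(messages) -> str:
    resp = client.chat.completions.create(model="gpt-4o", messages=messages)
    return resp.choices[0].message.content

def run_graphql(query: str) -> dict:
    resp = requests.post("https://example.com/graphql", json={"query": query}, timeout=30)
    return resp.json()

def query_with_retries(question: str, max_attempts: int = 3) -> dict:
    messages = [
        {"role": "system", "content": SYSTEM_INSTRUCTION},
        {"role": "user", "content": question},
    ]
    result = {}
    for _ in range(max_attempts):
        query = ask_model(messages)
        result = run_graphql(query)
        if "errors" not in result:
            return result
        # Feed the server's own error messages back so the model can repair its query.
        messages.append({"role": "assistant", "content": query})
        messages.append({
            "role": "user",
            "content": f"That query failed with: {result['errors']}. Please return a corrected query.",
        })
    return result
```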

Obviously it helps a lot to have descriptive names for your queries; naming things just got even more relevant :)

Do you have an example of this working? I can't seem to get it to read GraphQL…

Sorry, it’s a bit too entangled right now. But just to reiterate:

  1. you need to feed the GraphQL schema to GPT yourself - it won't do the introspection for you, since it doesn't understand the concept of GraphQL at this time.
  2. Then you need to use function calling to make the actual call to your API (see the sketch below).
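Putting the two steps together, a minimal sketch with the OpenAI Python SDK could look like this (the model name, file path, and endpoint URL are placeholders, not my actual setup):

```python
# Step 1: supply the schema yourself. Step 2: expose a single "run_graphql"
# function and execute whatever query the model asks for.
import json
import requests
from openai import OpenAI

client = OpenAI()

GRAPHQL_SCHEMA = open("schema.graphql").read()   # placeholder path

tools = [{
    "type": "function",
    "function": {
        "name": "run_graphql",
        "description": "Execute a GraphQL query against the API",
        "parameters": {
            "type": "object",
            "properties": {
                "query": {"type": "string", "description": "A GraphQL query document"}
            },
            "required": ["query"],
        },
    },
}]

def answer(question: str) -> dict:
    resp = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system",
             "content": "Answer by querying this GraphQL API.\n\nSchema:\n" + GRAPHQL_SCHEMA},
            {"role": "user", "content": question},
        ],
        tools=tools,
    )
    # The model only proposes the query; you make the actual HTTP call.
    call = resp.choices[0].message.tool_calls[0]
    query = json.loads(call.function.arguments)["query"]
    return requests.post("https://example.com/graphql", json={"query": query}, timeout=30).json()
```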

Thanks for the reiteration. I did a quick workaround by proxying my GraphQL query via a server; then I can use the GPTs via an OpenAPI schema.
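Something along these lines (a minimal sketch with FastAPI; the route name and upstream URL are placeholders, not my actual setup): the GPT action calls a plain REST endpoint, and the server forwards the query to the real GraphQL API.

```python
# Tiny proxy sketch: the GPT action POSTs {"query": "...", "variables": {...}}
# to /run-query, and the server forwards it to the upstream GraphQL endpoint.
import requests
from fastapi import Body, FastAPI

app = FastAPI()
UPSTREAM = "https://example.com/graphql"   # placeholder for the real GraphQL endpoint

@app.post("/run-query")
def run_query(body: dict = Body(...)):
    resp = requests.post(UPSTREAM, json=body, timeout=30)
    return resp.json()
```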

Hey Robert, I'm testing a similar use case but finding that the schema is too big to use as context.

It's kind of working when I pass a subset of the schema, but that's not ideal.
Is it working for you because your schema is small enough, or have you looked into an alternative (embeddings, fine-tuning)?

Thanks!

For sure. It'd be great if there were a way around it; do you have a suggestion? The best would of course be if it were natively supported by function calling. Is there a way to get OpenAI to prioritize this?

We previously tried a similar approach: using RAG to retrieve the relevant parts of the schema and feeding errors back for self-correction. It generates better queries than a single pass, but the problem is that it still doesn't work very well on complex schemas and can take multiple iterations (and thus long latency).
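For reference, the retrieval step was roughly along these lines (an illustrative sketch, not our exact code; the chunking rule, embedding model, and file path are assumptions): split the SDL by top-level type definition, embed each chunk, and only include the most relevant chunks in the prompt instead of the whole schema.

```python
# Illustrative schema-RAG sketch: embed schema chunks once, then pick the
# top-k most similar chunks to the user's question for the prompt.
import re
import numpy as np
from openai import OpenAI

client = OpenAI()

def embed(texts):
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([d.embedding for d in resp.data])

sdl = open("schema.graphql").read()          # placeholder path
# Naive split: one chunk per top-level type/input/enum/interface definition.
chunks = re.split(r"\n(?=(?:type|input|enum|interface)\s)", sdl)
chunk_vecs = embed(chunks)

def relevant_schema(question: str, k: int = 8) -> str:
    q = embed([question])[0]
    scores = chunk_vecs @ q / (np.linalg.norm(chunk_vecs, axis=1) * np.linalg.norm(q))
    top = np.argsort(scores)[-k:]
    return "\n\n".join(chunks[i] for i in sorted(top))
```

Chunking by type keeps related fields together, but complex queries often need several interdependent types at once, which is exactly where this kind of retrieval starts to fall short.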

Would love to see if there's any update on the roadmap for official support.

Being able to upload a GraphQL schema file would be great too.
