Integrating GPT with external databases

I have an account with YCharts for financial data and analysis. I was wondering if we are able to provide credentials in ChatGPT so it can access the data and extract it based on prompts. I want to create a GPT with specific requirements, where the data can be accessed from my YCharts account. Is this possible?


You can do it if the destination meets the following conditions:

  1. Nothing blocks access to the content, such as JavaScript rendering, robots.txt, or a login wall. By "login" I mean a page where the visitor must sign in to see the content. Some cases are questionable, for example: can I add a link such as https://platform.openai.com/playground for my GPT to use as data, even though I have to log in?

  2. The content should be directly usable information, such as text or images on the page. If it is split across other pages, you should include links to those pages.


Yep, with a simple API on top of your data, if you don't want to integrate directly with the data source.

Yes, I have a lot of experience doing this. Best approach is:

a) use the external database's API to ingest knowledge / sources into a simple middleware.
b) expose an API from your middleware.
c) copy the OpenAPI schema from that API and paste it into a GPT Action.

I can provide more detail if needed.

In any case, this approach allows you to adapt more nimbly as the knowledge sources / external database inevitably evolve over time.
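For concreteness, here is a minimal sketch of steps (a)-(c). Hedged: FastAPI is just one option for the middleware, and `fetch_from_source` plus the `/search` endpoint are hypothetical stand-ins for whatever client and routes you actually build.

```python
# Minimal middleware sketch: wraps an external data source behind a
# small REST API that a GPT Action can call.
# Assumption: fetch_from_source() stands in for your real ingestion
# logic (YCharts, BigQuery, ...); the endpoint shape is illustrative.
from fastapi import FastAPI, HTTPException

app = FastAPI(title="Knowledge Middleware", version="0.1.0")

def fetch_from_source(query: str) -> list[dict]:
    # Replace with your real database/API client.
    return [{"symbol": "EXAMPLE", "metric": "pe_ratio", "value": 12.3}]

@app.get("/search")
def search(q: str):
    """Search the ingested knowledge and return matching records."""
    results = fetch_from_source(q)
    if not results:
        raise HTTPException(status_code=404, detail="No matching records")
    return {"query": q, "results": results}
```

One convenient property of FastAPI for step (c): it auto-generates the OpenAPI schema at `/openapi.json`, so copying that document into the GPT Action editor is all the schema work you need.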


Hey Cass, are there any limitations to consider with this approach, in terms of the amount of data that can be integrated from an external database through a middle-layer API into the custom GPT Action?

I was wondering if I would be able to connect our relatively large BigQuery database to a custom GPT in ChatGPT for advanced data analytics.


Hey brother, nice reply! Your answer inspired me a lot.

In my experience integrating the OpenAI API with external sources, the limitation is the context window (128,000 tokens at the largest) if I'm injecting the data into the system or user prompts. A different limitation applies if the middle layer exports the data as a file and you use an Assistant with file_search; those limits are documented separately.
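For example, a quick guard the middle layer could run before injecting data into a prompt. This is a sketch using the `tiktoken` tokenizer; the 128k budget, the `cl100k_base` encoding, and the reserved-reply margin are assumptions matching the numbers above:

```python
# Sketch: count tokens before injecting data into a prompt, so the
# middleware can truncate or summarize instead of overflowing the
# 128k context window mentioned above.
import tiktoken

MAX_CONTEXT_TOKENS = 128_000  # assumed budget, per the post above

def fits_context(payload: str, reserved_for_reply: int = 4_000) -> bool:
    """Return True if payload leaves room for the model's reply."""
    enc = tiktoken.get_encoding("cl100k_base")
    return len(enc.encode(payload)) <= MAX_CONTEXT_TOKENS - reserved_for_reply
```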

I was wondering if I would be able to connect our relatively large BigQuery database to a custom GPT in ChatGPT for advanced data analytics.

You can, and while the GPT may not be able to ingest the complete database, you can do some pre-processing of the data to work around GPT's limitations.
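As an illustration of that pre-processing, the middleware can push aggregation down into BigQuery so only a small summary ever reaches the GPT. A sketch assuming the `google-cloud-bigquery` client; the project, table, and column names are made up:

```python
# Sketch: aggregate in BigQuery so the GPT receives a compact summary
# instead of the raw table. Project/table/column names are hypothetical.
from google.cloud import bigquery

def monthly_revenue_summary() -> list[dict]:
    client = bigquery.Client()
    query = """
        SELECT DATE_TRUNC(order_date, MONTH) AS month,
               SUM(revenue) AS total_revenue
        FROM `my_project.sales.orders`
        GROUP BY month
        ORDER BY month
    """
    # Each row becomes a small dict; the full orders table never
    # leaves BigQuery.
    return [dict(row) for row in client.query(query).result()]
```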


I would say there are not really any limitations; the data-volume issue is solved by pagination in the API. Also, if you know what you are looking for, you can easily pull it up, add, delete, etc.

Here is a video I just did (https://youtu.be/LItw8qJxkas?si=WPLgwK0TTFESO0xm) creating MyAirTable with my team of devs at ChatGPT - Your Freindly Tech Hub. Even when fully building this, I did not run into limitations on data. From other threads I have read, it seems pagination would help or solve that problem. True, you can't have a super long output, but you can set the GPT up in a way that it knows how to "output the output", lol.
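For what a paginated Action endpoint could look like, here is a sketch (FastAPI again for consistency with the earlier example; `fetch_page` is a hypothetical data-access helper):

```python
# Sketch: a paginated endpoint, so the GPT Action can pull a large
# dataset in chunks instead of one oversized response.
from fastapi import FastAPI, Query

app = FastAPI()

def fetch_page(offset: int, limit: int) -> list[dict]:
    # Hypothetical: replace with your real data access (AirTable, SQL, ...).
    return [{"id": i} for i in range(offset, offset + limit)]

@app.get("/records")
def records(page: int = Query(1, ge=1),
            page_size: int = Query(50, ge=1, le=200)):
    offset = (page - 1) * page_size
    items = fetch_page(offset, page_size)
    # A next_page hint tells the GPT whether to keep fetching.
    return {"page": page, "page_size": page_size, "items": items,
            "next_page": page + 1 if len(items) == page_size else None}
```

Returning an explicit `next_page` value gives the GPT a clear signal to keep calling the Action until the data runs out.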


It is possible. Thanks to Actions, you can integrate any external data source with ChatGPT. What you need to do is make the data source accessible through a REST API. This might be technically difficult:

  • you need to write the backend code,
  • you need to deploy the code on a server and assign a domain,
  • you need to create an OpenAPI schema documenting your REST API (a minimal example follows below).
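For reference, a minimal OpenAPI schema for such an API might look like the following sketch; the server URL and the `/search` path are placeholders for your own deployment:

```yaml
# Minimal OpenAPI schema sketch for a GPT Action.
# The server URL and the /search path are placeholders.
openapi: 3.1.0
info:
  title: My Data API
  version: 0.1.0
servers:
  - url: https://api.example.com
paths:
  /search:
    get:
      operationId: search
      summary: Search records in the external data source
      parameters:
        - name: q
          in: query
          required: true
          schema:
            type: string
      responses:
        "200":
          description: Matching records
```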

However, the full process can be greatly simplified. I'm working on an open-source solution for serving Python notebooks as web applications. It can be used to serve notebooks as REST API endpoints. What is more, there is an option to deploy notebooks online with one click in our managed cloud service. Using this solution, I created a ChatGPT integration with Google Sheets and a ChatGPT integration with a Postgres database.