Dynamic, real-time knowledge base updating via API or URL

Could it actually be useful to have a knowledge base updated in real time, for example if I wanted stock prices or something like that?

Alternatively, a marketplace for custom GPT developers, listed in a directory within OpenAI, could also work.

I don’t have the key skills to implement an API schema, but I suspect any real-time data feed via API, or even a URL refresh, can be done.

In either event, I would recommend that the user interface cater to a wider audience, including those with no or low-level coding skills.

I want to increase productivity and not have to constantly update my knowledge base.

![Could not find a topic already in support - my search resulted in nothing](upload://71ky0VHypExSql06jq9HQkcD3Hb.png)

Every API has a specific path, such as v1/quote, and a method (GET, POST, PUT, etc.).

You can refer to the API provider’s documentation, then copy the server URL, method, and path that you wish to use.

Here’s an example:

  • API URL: https://finnhub.io/api/
  • Method and Path: GET /quote
  • JSON Body: none required for this endpoint

If the documentation specifies that your request should be sent in the path instead of in a JSON body, the path would appear as follows:

/quote?symbol=AAPL

In that case, you don’t need to send a JSON body.
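To make the query-string style concrete, here is a minimal Python sketch of how such a request URL is assembled. The base URL and parameter names follow the Finnhub example above; the API key is a placeholder, and the exact auth parameter is an assumption from Finnhub’s public docs:

```python
from urllib.parse import urlencode

# Pieces copied from the provider's documentation (Finnhub, per the example above)
base_url = "https://finnhub.io/api/v1"
path = "/quote"
params = {"symbol": "AAPL", "token": "YOUR_API_KEY"}  # placeholder API key

# For a GET request like this, the parameters travel in the URL itself --
# no JSON body is needed.
request_url = f"{base_url}{path}?{urlencode(params)}"
print(request_url)
# https://finnhub.io/api/v1/quote?symbol=AAPL&token=YOUR_API_KEY
```

Swapping `"AAPL"` for another symbol changes only the query string, which is exactly why it can be treated as a variable.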

In this case, “AAPL” is a variable. If you want GPT to search for any symbol, replace it with {variable} or {symbol}.

So, once you’ve determined what you need (server URL, method, path, and JSON body if required), you can go to ChatGPT, paste it in, and say:

Build an API schema as JSON.
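As a rough illustration of what you end up with, here is a Python sketch that assembles the same pieces (server URL, method, path, and a `symbol` variable) into a minimal OpenAPI-style schema of the kind GPT Actions accepts. The `title`, `operationId`, and descriptions are invented for the example, so treat this as a sketch rather than a definitive schema:

```python
import json

# Pieces gathered from the provider's documentation (Finnhub, per the example above)
server_url = "https://finnhub.io/api/v1"

schema = {
    "openapi": "3.1.0",
    "info": {"title": "Stock Quotes", "version": "1.0.0"},  # invented title/version
    "servers": [{"url": server_url}],
    "paths": {
        "/quote": {
            "get": {
                "operationId": "getQuote",  # invented operationId
                "summary": "Get a real-time quote for a stock symbol",
                "parameters": [
                    {
                        "name": "symbol",   # the {symbol} variable from the path
                        "in": "query",      # sent as a query parameter, not a JSON body
                        "required": True,
                        "schema": {"type": "string"},
                    }
                ],
            }
        }
    },
}

# This JSON is what you would paste into the Actions schema editor
print(json.dumps(schema, indent=2))
```

ChatGPT will typically produce something of this shape when asked; you mainly need to check that the server URL, path, and parameter names match the provider’s documentation.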

If your API provider is well-known, you can simply ask ChatGPT. For example, you could say:

“Build an API schema as JSON for polygon.io. Include the server URL, path, etc.”

With GPT, you don’t need coding skills!


Alrighty, I’ll give it a go and let you know.

Your steps are clear enough, and yes, I know what JSON is :ok_hand:


Morning. How often do you need the knowledge base to update? And is the knowledge from a public or private source?

I can define a path for you if you share a few specifics.


In this theoretical model, it would ideally be real time, but as a starting point, and to facilitate deployment of a working model, we could consider daily or hourly updates and then work down to real time. Also, what are the variables that determine how often we can pull the data with API calls?

In particular, I’m thinking of practical information such as interest rates, and beyond that, news/updates/developments in a specific area of interest that would allow the custom GPT to perform with relevant information of the day.

There are a ton of practical applications that could be leveraged.

What’s the next step?

Hey Dany, OK, I think I have a solution. You can bring the sources into the suefel.com knowledge base, then copy the FastAPI endpoints and add them to Actions in a GPT. The refresh rate can be tuned. Let’s see how close we can get to “real time.”

I’ll follow up on the DM sent as well.

I have a question along these lines. I’m building a few custom GPTs to help in my businesses and also to help me with writing a book. I’d love for the GPT to be able to “commit a new chapter to memory.” Currently, I have added the PDF to the custom GPT, and it is doing a great job helping me edit, but when I make a new version of the book, I must edit the GPT and upload the new PDF to continue working.

I have another GPT that helps my employees build user’s manuals for our clients. We go out to the client’s facility and collect information about their systems: makes, models, serial numbers, locations, etc. The GPT then does a little research and produces a user’s manual for the function of the machine, along with all of the devices on it. It would be great if the GPT could commit this to memory without us having to edit it and add the information for the client, and if clients could make updates on their own.

Does anyone know a good way to do this?

I have a question about uploading a web URL to the GPT assistant or a GPT.

I want to build a chatbot that knows everything at the web URL (sitemap) and, in conversation with the user, refers to the URLs the knowledge was extracted from.

Can you tell me how I can do this?
