I want to create GPTs, feed them knowledge, and have my work colleagues access that knowledge - how?

Hi Community, this is my first post here, and I haven't found anything on this in other threads. The situation is the following:

  • we are a medium-sized startup and now work more and more with ChatGPT. Several people already have Pro accounts, and now we want to bring those together into a corporate workspace to be more effective at sharing knowledge and data.
  • my use case is that I personally have many chats that are already extremely knowledgeable about our company, products, customers, etc. This knowledge was built up out of many of my prompts and the data I fed in, and I want to share it with my colleagues via a Custom GPT.
  • we also use Snowflake as our data warehouse and want to make it easier for colleagues to retrieve the SQL queries that were already built. I have already taught my SQL Query GPT a lot of information, and I'd love for that knowledge, and the information about our warehouse, tables, etc., to also be accessible to my colleagues.

Is this possible? If so, how?

I already tried creating a Custom GPT and fed it huge amounts of text in the chat. But when starting a new chat and asking about this information, it can't provide any of it, or only at a very high level. I thought that every time I share information in the chat, I provide more and more data to the GPT that other colleagues can then also use. Am I mistaken here?

Can you guys point me in the right direction on how to achieve this for our company, or tell me what I need to learn/read about to get there?

Thanks!

Greetings
Adrian


Sending replies to the GPT doesn't mean it will remember them long-term; current technical limits prevent this.
This is called the context length - you can read up on it here.

To connect it to a SQL database or similar, I can only recommend LlamaIndex.
It is built for exactly what you want, especially for huge amounts of data that it should know/remember.
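Under the hood, text-to-SQL frameworks like LlamaIndex mostly do one thing: retrieve the table schemas relevant to a question and hand them to the model together with the question. As a rough, stdlib-only sketch of that underlying idea (the table names and helper functions here are invented for illustration, not LlamaIndex's actual API):

```python
import sqlite3

# Toy "schema knowledge base": one row per warehouse table, holding its DDL.
# A framework like LlamaIndex builds and queries an index like this for you.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE schema_docs (table_name TEXT, ddl TEXT)")
conn.executemany(
    "INSERT INTO schema_docs VALUES (?, ?)",
    [
        ("orders", "CREATE TABLE orders (id INT, customer_id INT, total NUMERIC)"),
        ("customers", "CREATE TABLE customers (id INT, name TEXT, segment TEXT)"),
    ],
)

def relevant_ddl(question: str) -> list[str]:
    """Return the DDL of every table whose name appears in the question."""
    rows = conn.execute("SELECT table_name, ddl FROM schema_docs").fetchall()
    return [ddl for name, ddl in rows if name in question.lower()]

def build_prompt(question: str) -> str:
    """Assemble a text-to-SQL prompt: retrieved schemas plus the question."""
    return ("Given these tables:\n"
            + "\n".join(relevant_ddl(question))
            + f"\nWrite a SQL query for: {question}")

print(build_prompt("Join customers and orders to get revenue per segment"))
```

The real library replaces the naive name-matching with proper retrieval and can also execute the generated SQL against your database, but the flow is the same.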

Do let me know if you need any more help! :hugs:


You need to build a Custom GPT and upload documents (20 max) to its knowledge. Then anyone who uses it can draw on your knowledge.


Hi Mitchell, thanks for your reply. So I can "just" paste all that knowledge as plain text into a doc or Google Docs, upload it, and the GPT can then always access that knowledge? I now understand that whatever is in chats is not "teaching" it or providing any new knowledge. At first I thought I could just "chat" with it and provide the text there, but I understand that this doesn't work. I'll try the upload approach first. It should suffice, and I can of course create different GPTs for different topics. Is there any limitation on the size of the docs I can upload?

Yep, name it, and if you say "read knowledge 'name of doc'" it will - for anyone.

It can hold 20 docs max; click the upper-right corner of the pink doc tab to remove one.

Hey @j.wischnat, understood. This might be something for a later stage. To begin with, so that more people interact with and use it, do you think it could also work if we uploaded written knowledge about SQL queries via the knowledge upload? E.g., we have maybe 10 tables that are needed for most of our data queries. If the definitions of those tables are uploaded, would the GPT then digest that knowledge when creating SQL queries? And could I also upload docs in text form with specific SQL queries and some written context, to be used as a knowledge base?

The short-term goal should be that some non-dev/data people are able to get queries out and run them in Snowflake.
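To make the "table definitions plus saved queries as knowledge" idea concrete, one uploaded doc could look like the sketch below (table, column, and query names are invented for illustration; DATE_TRUNC is standard Snowflake SQL). The written context line next to each query is what lets non-technical colleagues find it by asking in plain language:

```sql
-- Table: ORDERS, one row per customer order.
CREATE TABLE orders (id INT, customer_id INT, order_date DATE, total NUMERIC);

-- Query: "Monthly revenue"
-- Context: use this when someone asks for revenue per month.
SELECT DATE_TRUNC('month', order_date) AS month, SUM(total) AS revenue
FROM orders
GROUP BY 1
ORDER BY 1;
```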


It won't handle metadata like chart formats or images from docs; it reads text.


Also, it is read-only; it won't write to knowledge.

Meaning, if I provide a doc with a long SQL query, a colleague can't just ask "share the query for XYZ" and have it output that query?

Yes, it will - exactly as written.

So the solution I am looking for is quite "advanced" to achieve at today's level, and not easily "buildable" without significant investment? I am trying to understand how I can achieve such true knowledge sharing within our company. :thinking: Because a solution like LlamaIndex is of course also nothing I can quickly set up without time and extra cost.

I build RPGs, so format is important. If you set it up as a structure, it will read it perfectly; but if it is an image, or a chart-like structure with no syntax or logic, it won't read it as a chart ("metadata"). If it is just straight code, it reads perfectly.


No, just upload your docs and have it read them; anyone who uses your custom GPT can access them. With a $20 Plus account, just share the GPT via link. Put your docs in the knowledge upload, and once they are in, have it read them. Save your GPT. With "read knowledge" you can even instruct it to list an index of all knowledge at hello.

For example, in the instructions say "at first user prompt, list all documents held in onboard knowledge". That's generic, so it just lists the knowledge load.

You can even have a read-first document that tells the machine how to function and how to use your data, but with no instructions it won't work. You need to tell it how to handle your uploads.

Example instructions: at the first user prompt, read index.txt (or whatever), and have that set out how to read the docs, list commands, or show help. I.e., you can put instructions on how to use your GPT there, so folks can access what they need via help functions.
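As a concrete sketch, the GPT's instruction field could contain something like this (the doc names are placeholders, not real files):

```text
At the first user prompt, read index.txt and list the documents it names.
When the user asks for a query by name, return the SQL from queries.txt
exactly as written, then briefly explain what it does.
If the user types "help", list the available commands from index.txt.
```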


OK, let me give you an example case. I would like to upload a doc with our sales/marketing ideal customer profile: the values, the problems, information about our products, etc.

How would I need to teach the GPT to digest the data so that any other person could ask questions, and the GPT would use the provided data together with OpenAI's own data to give strategic advice? Would that work? Or is it not able to "mix" my data with its own?

Have a set of rules in the instructions explaining your data structure. If you use docs, you need to tell the machine how it should access them, e.g., "index the sales doc at the user prompt 'sales index'".

You can format your data through GPT instructions: have one doc that is a read-first and indexes your data documents.
In the instructions, tell it to read that doc first at the first user prompt and to list functions/help. If you index each doc by name, with sub-indexes inside the docs, it can read them; and if you put how to read them in the instructions, it will follow your function format.
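For example, a read-first index.txt could look like the following (all file names here are placeholders for whatever you actually upload):

```text
INDEX
1. sales_icp.txt - ideal customer profile, values, problems, products
2. tables.txt    - warehouse table definitions (10 core tables)
3. queries.txt   - saved SQL queries, each with a name and a context note
Read the matching doc before answering questions on its topic.
```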

It won't cost you much, but it may take you time. You have to explain to it how to organize the data.

You just need to build your functions

Without that it's just data. It needs structure, and the GPT needs to be "trained" to use it, i.e., told how to.


Got it. I had hoped we were already further along with respect to the ease of creating such GPTs. As you said, it requires quite some knowledge and takes time. I hope such use cases will get some attention from OpenAI soon. I bet thousands of companies have the same idea/problem: they would love to use the knowledge of ChatGPT itself and enhance it with their own. Right now, however, getting there is not really easy for non-technical users. The funny thing is that it works with my own chats: they have hundreds of my prompts saved and take that information into consideration when asked something. It's unfortunate that the same is not possible for shared GPTs.

I mean, as @mitchell_d00 already pointed out, this is possible.
It is a simpler form of the more advanced implementation with LlamaIndex. Of course it will have some limitations, but ultimately, if you're not passing a TON of information, @mitchell_d00's solution will work just fine for you.


As long as you view it like eating an elephant one nibble at a time, even massive amounts of data can be indexed and sub-indexed. As long as the system understands your format, and you keep human error out of the initial index and function rules, it will follow its rabbit holes :hole::rabbit::heart:

Basically, it is an LLM seeded with onboard knowledge.
I.e., you explain exactly how you want it read.
