A sanity check for future plugins to access private SQL databases

Hi Guys,

We (https://www.c-phrase.com) are doing some planning around a future offering that lets companies access their private SQL databases via ChatGPT.

Our 5-step model is as follows:

  1. companies launch C-Phrase VMs from AWS or Azure and run their own natural language interfaces (NLIs) over their own private databases.
  2. companies set up HTTPS on these machines, exposing a REST interface over their NLIs using their registered domains.

We have customers that already perform these steps. They are comfortable doing this because they keep their data completely private, even from us.
Now in the near future:

  3. Manifests are (semi-)automatically generated, and the REST API (see the Administrator's Guide), suitably extended,
    accepts calls from ChatGPT. In effect, by doing this the company defines its own plugin running on its VM.

  4. Companies install THEIR plugin as an unverified plugin in their ChatGPT account.

  5. Companies can then bring answers from their SQL databases into their ChatGPT dialogs.
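For concreteness, here is a rough sketch (as a Python dict, for convenience) of the kind of manifest such a VM might serve at /.well-known/ai-plugin.json. The field names follow OpenAI's published plugin manifest format; the domain, names, and descriptions are invented placeholders, not our actual offering.

```python
import json

# Hypothetical ai-plugin.json manifest a company's VM might serve.
# Field names follow OpenAI's plugin manifest format; all values are
# placeholders.
manifest = {
    "schema_version": "v1",
    "name_for_human": "Acme Database NLI",
    "name_for_model": "acme_db_nli",
    "description_for_human": "Ask questions over Acme's private database.",
    "description_for_model": (
        "Accepts short natural language questions about Acme's customers, "
        "orders, and products, and returns answers from the private SQL "
        "database. Send plain English, not SQL."
    ),
    "auth": {"type": "none"},  # a real deployment would use an auth type
    "api": {"type": "openapi", "url": "https://nli.example.com/openapi.yaml"},
    "logo_url": "https://nli.example.com/logo.png",
    "contact_email": "admin@example.com",
    "legal_info_url": "https://nli.example.com/legal",
}

print(json.dumps(manifest, indent=2))
```

The description_for_model field is where most of the "context" would go, since that is what ChatGPT reads when deciding whether and how to call the endpoint.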

There is a final critical technical issue around returning ambiguity resolution requests back to the ChatGPT user.
See the CHOICE LIST in the Administrator's Guide. We require this type of capability.
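The ambiguity-resolution flow could be sketched roughly like this: when a question is ambiguous, the endpoint returns a choice list instead of an answer, so ChatGPT can relay the options back to the user. All names, questions, and response shapes below are assumptions for illustration, not our actual API.

```python
# Minimal sketch of the NLI endpoint's response logic (web framework
# omitted). Names and response shapes are hypothetical.

def handle_question(question: str) -> dict:
    """Answer a natural language question, or ask for disambiguation."""
    if question == "show sales by region":
        # Ambiguous: 'region' could mean sales region or shipping region,
        # so return a choice list for the user to resolve.
        return {
            "type": "choice_list",
            "prompt": "Which meaning of 'region' did you intend?",
            "choices": ["sales region", "shipping region"],
        }
    # Unambiguous questions are translated to SQL and answered directly.
    return {"type": "answer", "rows": [("North", 1200), ("South", 950)]}
```

ChatGPT would then need to present the choices to the user and resend the disambiguated question, which is exactly the capability we are unsure plugins support.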

OK, anyway, we are still waiting to get plugin access. Still, if anyone can comment on the plausibility of these plans, we would appreciate it.



I’m still trying to wrap my head around the security and privacy contexts surrounding plugins, but this is how I currently understand solutions that might meet your requirements.

  • First, I don’t think every solution that uses data from sources external to ChatGPT must use GPT Plugins. For a long time I’ve been building systems that use OpenAI and LLMs while blending proprietary data and content into GPT responses. This is achieved by building custom chat UIs that can carry on natural language conversations embellished by information from other APIs. Nothing prevents you from building a user experience that does this.

  • GPT Plugins hit a sweet spot for organizations that want their data to participate in the demand for AGI across several contexts, from consumer apps (like travel sites) to B2C conversations that thrive on pervasive access. Plugins transform ChatGPT from a clever experiment into an app store with vast B2B and B2C implications. How does this map into your business requirements? I don’t think it does, but feel free to sway my assessment. :wink:

… back to the ChatGPT user.

I don’t think you want that unless you are referring to “chat user” in the general sense. I have a hunch you want a closed and proprietary UX that advances the deep association you’ve carved out for your customers and which leverages LLMs to their fullest extent. This project comes to mind when crafting your own ChatGPT experience and possibly this one.

Am I warm?


Thanks for your thoughtful response Bill.

I agree that, when possible, custom UIs can bypass the complexities of plugin integration.
And I agree that plugins for general data sources (like travel, weather, etc.) make a lot of sense.

But consider the following scenario.

We have a private database with the records

We integrate a plugin so that ChatGPT can ask simple English questions over an endpoint to access
this private database. The manifest gives ChatGPT a lot of context about what
natural language questions it can send to the endpoint.

Then we ask ChatGPT, “in our database, which customers might be interested in buying a bottle of red wine”.
Now, magically, because that is how LLMs seem to work, ChatGPT determines that it should ask the endpoint for “customers age over 18”.
It gets back the answer ‘Samuel’ and then integrates that into its response.
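To make the scenario concrete, here it is as a runnable sketch with an in-memory SQLite database. The table and rows are invented to match the example above.

```python
import sqlite3

# An invented private database for the scenario: asking for
# "customers age over 18" should return Samuel.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (name TEXT, age INTEGER)")
conn.executemany(
    "INSERT INTO customers VALUES (?, ?)",
    [("Samuel", 34), ("Ida", 12)],
)

# The NLI endpoint would translate "customers age over 18" into SQL:
rows = conn.execute(
    "SELECT name FROM customers WHERE age > 18"
).fetchall()
print([name for (name,) in rows])  # ['Samuel']
```

The point of the plugin approach is that ChatGPT only ever sees the English question and the answer; the NL-to-SQL translation stays on the company's VM.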

The advantage of this plugin approach is that it would leverage LLM common sense in asking structured questions over private databases, and
it would let users access this through the standard ChatGPT website at https://chat.openai.com/chat.

Of course, it would require some fiddling to install the private plugin. My guess is that an admin (the same admin who manages C-Phrase)
would do the setup and then share it with people in their company department.

Does this make any sense?


I have to agree with @bill.french here. The idea is valid, but I don’t think going through plugins and ChatGPT is the best way.
Also, I think it’s not wise to develop a product that is so dependent on an external one. We are already dependent on OpenAI APIs, but now you would be 100% dependent on whatever ChatGPT happens to do.


A custom UI would also leverage the power of LLMs. The only difference is that instead of doing this through common machinery provided through the plugin, it would do it in a far more controlled environment capable of doing things the plugin could not. Imagine chained interactions deeply dependent on your data and desired outputs. I believe plugins will have some limitations that wouldn’t exist in a custom chat implementation.

Indeed, and if this is a business requirement, you should do that. I have a sense that not all B2C relationships work best when customers are asked to interact with their data through a universal “consumer”-facing app that everyone uses; it might cause concerns. Is the interaction private? Who else can see this data and what I’m asking? These and other trepidations may stem the adoption of plugins in some B2C or B2B relationships, where the provider’s data is perceived as “private” and a universal client may not be the ideal delivery mechanism.

And are you comfortable eliminating your brand from this interchange? Plugins might be able to provide a very comfortable experience for your customers, but they will do so in a brandless environment. Your only differentiation as a data/knowledge provider will be in the results of the conversations. Everything else around these conversations will be the OpenAI brand.

This is also a potential issue. What if OpenAI decides to run ads around its chat client app? Oops! Now you have the possibility that your competitors could be tugging at your customer base while they’re trying to use your solution. Ask ChatGPT if this is a good idea. :wink:

One last item to consider…

Do you think these brands will have only one pathway to their AI experiences? Probably not. It’s no different from mobile apps in app stores. They provide an “app for that”, but they also realize customers may need other contexts. If this plays out like it logically should, we can expect to see a lot of chat pathways for every business that can benefit from AGI.

I believe there’s nothing inherently wrong with your ideas or approach as long as they align with your business objectives.


I agree with @bill.french here. While you can create a plug-in that will allow ChatGPT users to access your client’s data in the way you describe, I don’t see the motivation for providing this access specifically through ChatGPT rather than through a custom chat interface where you have much more control over the functionality/experience. Perhaps the motivation is clear to you, in which case explaining whatever it is might help focus this discussion better on your needs.


As with all software, the requirements matter more than the implementation details.


Agreed. As a developer it currently makes no sense to make a ChatGPT plugin.
I have no sense of direction with it, resulting from a lack of transparency.

ChatGPT plugins are just…weird. I really dislike them. They are shrouded in mystery.
Why would they do this? Why do they want to manage an app store?

As already said. Why would I even bother to make a plugin using ChatGPT unless at some point it can be implemented into my own service? Why would Kayak want people to go to OpenAI’s website for ChatGPT instead of their own? Where are the benefits?

Do they expect ChatGPT to be “the” public resource for knowledge and therefore consider their plugins to be advertising space? Do they expect ChatGPT to eventually be used on their platform?

It makes no sense, which usually means that I don’t have all the puzzle pieces.


I have often debated and perhaps over-analyzed your comments in other threads. But in this one, I think you may be underestimating the depth and value of your analysis. I’m simply asking - do plugins have a place? My conclusion is still in the “maybe” column but that maybe is shrouded in a lot of business-related questions.

I think you are making a valid point by asking …

Who would assign all rights, title, and interests in authoritative knowledge to a single entity?

Is this idea of plugins not jumping from one big-tech monopolistic environment (Google Search) to perhaps another (OpenAI)?


You’re right. After all, I am just a toaster. For that reason, I hope that people question everything that I say, and that I can learn and gain some depth, or even an entirely new perspective, as a result. It’s happened to me more times than I can count. As I’ve mentioned before, I am not an expert, and do not want to be labelled as such. At my kindest, I am just a stan.

That’s how I feel; it’s such a ridiculous move to make. I wish they would just focus on their models, and not try to take on such a massive industry. I feel like these decisions are “influenced” by very influential “supporters” who have wanted their piece of the pie for a long time. Just how I feel. I have nothing to support this claim.


Lots to consider here. Still, here is a prediction:

A very large segment of users will just continue to visit their bookmark: https://chat.openai.com/chat.

Companies will deploy plugins that let employees access private company databases. The dashboards to administer plugin permissions for user accounts are probably being developed at OpenAI as we speak. (Note that the Citadel agreement/deployment will shed a lot of light.) OpenAI will be offering site licences to companies; Citadel seems to be one of the first.

Now, I assume that OpenAI is also building a lot of machinery around how it inspects manifests, probes plugin APIs, and stages sequences of calls. The methods they use may be orthogonal to the basic LLM prompt/completion + RLHF conditioning, or maybe just applied at a higher order over highly abstracted sequences of dialogue/data-fetch actions.

Still, it is our thesis that the best way to get at the private SQL databases that plugins will access is through short natural language glosses of the information needed, not idiosyncratic API calls or company-specific SQL. So we are trying to position C-Phrase to bridge this gap. That is why we are interested in experimenting with plugins.

BTW, we still don’t have plugin access. Has anyone managed to get access to plugins? Any ideas about how one can speed up one’s case?


I don’t know about that.

Realistically, if I want instant information such as flight tickets, I would just use the company’s actual application.

I like my taps, I like knowing that pre-defined actions have pre-defined results. I like pure functions when I am dealing with sensitive information!

In the example of flight tickets, okay, so I can see how much Kayak reports that it costs using ChatGPT. Now I enter a structured NLP process for payment?

What the heck is different between this and simply using the application? I open ChatGPT instead of Kayak? Why would Kayak even want this?

I don’t understand the obsession with using NLP to solve everything. I don’t think I have ever thought to myself, “Jeez, I wish I could just talk my way through this application”. Quite the opposite, I usually think “I wish this process was slightly simpler, tap, tap”.

In regards to Kayak, the application is great because I don’t need to speak to someone to buy the tickets anymore!

On top of all that, based on the documentation there is no hidden magic behind ChatGPT plugins. You write a manifest, the description is injected into the context, and then GPT decides what you want based on it.

Sounds pretty dang similar to what we are all already doing, with a lot less control and freedom.


I agree with you that many people prefer point-and-click navigation to get things done in applications. Shneiderman had a famous paper where he really dunked on natural language interfaces in favor of direct manipulation.

Still, I think the potential is that ChatGPT becomes a very dominant interface where people start to actually get more and more done via dialog. Still, all that typing is going to be painful. We have handled this for the limited natural-language-to-SQL database case by letting users click to build questions and commands in addition to typing.

Perhaps ChatGPT will adopt similar methods…


More to the point of what you said.

What if ChatGPT gives you a concise summary of your order and then asks you to tap (or even, yes, type) ‘yes’ or ‘no, revise’ to place it?

Seems like these confirmation dialogues will become pretty standard pretty soon.

As for the magic of generating API calls from reading the manifest, it will have to be sophisticated to actually work in a general way. ChatGPT is actually not so good at generating SQL.

Sure we could have used a better prompt and more advanced model. Still…


Good question. I would definitely be happy with that.

I agree with all that you have said. I truly believe that in n years we will speak our way through our days.

Order groceries in the kitchen while we prepare ingredients, plan vacations while cutting vegetables, and then have an automatically generated schedule for the rest of the day.
Or even just order Chinese take-out with a single “My favorite, delivered in an hour please” said from the living room couch.

None of this technology is ready though. ChatGPT plugins will be massive. It’s an app store. I can understand the reason for it, I just want OpenAI to focus on their dang models, and not focus on complete domination of the inevitable AI-entwined future.

Stay on one path, and let others take advantage of the technology to venture off, and create separate paths. Ya know. I’m sorry, this doesn’t have much to do anymore with your discussion.

Going back, text-to-SQL seems to be the hot topic lately. I have been using ada to perform simple filtering for my database retrievals (using vector databases, though), and it’s working very well.
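For what it’s worth, that kind of similarity filtering can be sketched like this. The vectors here are tiny hand-made stand-ins so the ranking logic runs on its own; real ones would come from an embedding model such as text-embedding-ada-002, and the records and threshold are made up.

```python
import math

# Toy records with hand-made "embedding" vectors standing in for real
# model embeddings.
records = {
    "red wine tasting notes": [0.9, 0.1, 0.0],
    "white wine pairings": [0.7, 0.3, 0.1],
    "lawn mower manual": [0.0, 0.1, 0.9],
}
query_vec = [1.0, 0.0, 0.0]  # pretend embedding of "red wine"

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Keep only records above an (arbitrary) similarity threshold, best first.
hits = sorted(
    ((cosine(query_vec, v), k) for k, v in records.items()),
    reverse=True,
)
filtered = [k for score, k in hits if score > 0.5]
print(filtered)
```

The same shape of pre-filtering works whether the candidate set comes from a vector database or a plain SQL retrieval.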


Indeed. And they will generally go there for general tasks. Increasingly, consumers and businesses will switch from Google search to ChatGPT for various answers and helpful completions, conversations, and content development.

Do you or any of your customers presently use Google search to access personal or private company data?

It’s a silly notion, of course. But why didn’t search plugins evolve to do this, if Google is such a great search engine? The public is vastly hooked on semantic search, but apparently not enough to demand direct access to their private or corporate data through that same UI.

The answer is obvious - it’s a weirdly stupid idea fraught with many serious issues, the worst of which is trust.

Plugins will be massively successful, but probably not for the use cases you’re suggesting.

The first rule of plugin access: don’t talk about plugin access. :wink:


The first rule of plugin access: don’t talk about plugin access.

Good point. I wait patiently…

As for the case of people/organizations adding plugins to their private data over the general interface, I still think it is a possibility.
But I grant you might be right. Still, I don’t think the Google case applies here. ChatGPT could, in principle, support PIM (Natural Language to SQL for PIM - YouTube).

Time will tell. Still, I can bet that OpenAI will want to draw people through the standard chat interface. I wish I were a fly on the wall at Citadel to see how that deployment is rolling out.


Oh, and now that OpenAI is big, people/companies will trust it.
I use Google Docs for a ton of things in my company. There are also a lot of other companies I trust with sensitive information: HubSpot, AWS, etc. I basically trust that they won’t leak details…


It’s not a possibility; it’s happening. But we need to add a few predicates to the notion of “private data”.

When it comes to commerce, Google Search doesn’t collect your credit card information to transact with a seller of airline tickets on your behalf. Instead, it collects a vig for connecting you to the seller. Is it possible OpenAI will attempt to compress the transaction through plugins? Probably. But the private data they may collect to effectuate commerce transactions seamlessly is not the same as a proprietary database of drug formularies that will help healthcare professionals lower prescription costs for their employees.

I think @curt.kennedy summed it up the best here. OpenAI will most certainly exploit the use of “private” data.

Once again - predicates. Trust LLM providers with what data exactly? All data? Probably not. I can think of dozens of companies that will want deep integration through LLMs, but will insist their data will be entirely within their application environments.
