The new AI-powered Google Bard is a rival to ChatGPT

Inferencing costs are different from LLM development costs. The former will continue to drop to the point where the cost of providing an inference will be less than the cost of accounting and billing for them. The latter will continue to plummet, as witnessed in the LLM open-source communities.

That’s good. It’s wonderful that OpenAI (and other commercial ventures) find customers willing to pay for these services. However, there will soon come a time when the economies of scale and use cases will alter consumer demand. Switching costs will be pretty low and possibly near zero.

You can’t have a ten-minute latte break at Starbucks without overhearing a conversation about GPT Plugins and how they will revolutionize customer experiences with brands and real-time data. Plugins have their place, but those places will quickly narrow, and where plugins work best is shaped by three business drivers.

These grandiose AI visions in a web context represent extremely early, deeply flawed assertions. Here’s why…

  1. Brand matters. Businesses know that customers care about brands, which serve as signals for quality and trust.

  2. Security and privacy matter. Despite all the security measures that may be possible with GPT Plugins, the perception of security and privacy will be difficult to convey.

  3. Entities matter. Customers don’t telegraph this requirement, but in matters of information, they care deeply about who [exactly] they are getting advice from.

To overcome these challenges, OpenAI Plugins must provide an experience that quells these customer-sensitive requirements.

Businesses need a cake that customers want to eat. GPT Plugins are cannoli; businesses and their data are the cream filling.

There’s nothing inherently wrong with GPT Plugins wrapped around your tasty business data and processes.

This is why GPT Plugins will be very successful. It’s akin to the success of the pay-per-click model that Google used to dominate knowledge discovery for two decades. However, because we don’t complete our travel arrangements in the context of a Google search result, we also may not desire to use GPT Plugins to engage in commerce.

We will taste the delightful cream filling of integrated AGI with our brands, our data, and even our personal knowledge bases. However, the cake will probably be eaten elsewhere.


Yeah, as evidenced in Cannoli or Cake?, I fundamentally agree - plugins have great promise in a given wheelhouse. But, you correctly point out that another shoe will drop. Both of your contingencies are occurring at a significant pace.

Accuracy across the AGI segment is rising. AGI is finding its way into the environments where people work. Will they leave the places they’re accustomed to working to use a plugin when they don’t need to? I think we can all agree that the best plugin is no plugin.

I agree. But they aren’t concerned about the implementation details. They’re more concerned about the benefits. If those economic benefits come to them inside the tools and applications they already use, a plugin running in a web app over at an AGI purveyor’s web site will not be that interesting.


Agreed

I’m also excited by this future. OpenAI will also have to update their offerings at that point to stay competitive. If my $20 isn’t used for inference, they’ll have to provide me with more stuff to sweeten the deal :laughing:

I think you are overestimating plugins. Their functionality is easy to accomplish without being forced into using ChatGPT. The limitations greatly overshadow the benefits. I believe that they are wonderful for people in any sort of domain to connect to their existing server and have a usable product. But, for developers they are simply a restriction.

In your case, plugins are a perfect match. You want a conversational agent as a focal point; you want to bounce your thoughts off an AI. You want to use ChatGPT plugins to augment ChatGPT itself. You need to take a wider perspective, though, or at least consider the perspective of the people who are “underestimating” the power of plugins.

We are moving towards intuitive applications: gestures and icons whose meaning we “just know.” Commercial applications can display information in a pleasing, rapid way and allow for multiple interactive options, instantly, within a tap or two. I can, for example, create an account within a minute using Google sign-in, if it’s even necessary. On the other hand, there’s the potential of having to not only create two accounts but provide my credit card details twice just to try out a product in the plugin store.

Kayak, for example, has the freedom to build their application to perfectly suit the needs of the person exploring it. Videos, photos, reviews: all the important information condensed and concisely displayed so we can read it instantly and move on. I can consider 20 different flights within a minute without reading more than two sentences. There is no benefit to limiting oneself to a storybook text RPG (and that’s only the consumer perspective).

Even fast-food restaurants have line-ups for the computer kiosks instead of speaking to a cashier.

This doesn’t even consider the hallucinations of ChatGPT. What happens when there’s new information, for example regarding a country or city, that ChatGPT isn’t aware of? Or it makes information up? What if it says things that were true in 2021, but not now? It’s very important, honestly critical, that you DO know your domain when speaking to ChatGPT, so you CAN catch the hallucinations.


Indeed. But I think the $20 serves as a qualifier to some degree. OpenAI, unlike Microsoft and Google, is running on the open Interweb. Account-based inferencing services remove certain risks and provide the vendor with the ability to monitor and remove nefarious actors. OpenAI really doesn’t have a choice - it must charge for access.

It’s my feeling that eventually, customer-facing apps like ChatGPT will fall out of favor. These are demos that stuck. But their stickiness is probably temporary. Over the long run, these apps will continue to draw in new users whose lifecycle will repeat.

There’s also Anthropic and its language model, Claude, which will be backed by Google.

Try them all, and if you feel you’re not getting your money’s worth for $20/month, don’t subscribe.

I disagree, sir, though from a business perspective. I see you are building very complex systems, and in that case, limiting AI access to, well, anything, might be limiting. But I find for my [significantly less-complex] use, they’re very focusing.

I’ve been using GPT-4 on ChatGPT Plus for like two weeks now, and it’s changed my world… and I can’t tell you how much I freaking love the plugins. Yay!

First, I am excited to pay $20.00 per month on a subscription model. Like Ksaid said:

For me, $20 per month is just symbolic of me spending money with OpenAI rather than Netflix. But in paying, I also expect more things… like being able to use plugins! Yay! I think it is healthy to see this type of fee structure. While I agree that systems constrained by fees impact innovation, I think that impact can be positive when it fosters competition in the spirit of excellence. But I don’t think either OpenAI or Google will be facing funding difficulties in the near future, whether we all pay for a subscription or not.

And, to tell you the truth, I would be willing to pay more than $20 a month for certain features—like giving GPT-4 a longer memory. Just imagine, a single conversation with infinite context… :drooling_face: $35 a month would be worth it.

And, OMG you guys! Plugins are so fun! Yeah, you can only choose three per chat, and that IS definitely limiting, but it also helps focus the conversation. I’ve found Link Reader and Zapier already indispensable, and I have fun picking one plugin that is more specific to what I am working on. BlockAtlas is an econ nerd’s new best friend. GPT-4 Browsing is still pretty buggy; I’ve found it more effective to focus Link Reader on a single page.

Thanks for the in-depth comparison to Bard in this thread. Even two weeks in, this conversation tells me it’s all about context. Bard sounds pretty cool, and I’m eager to see it become a competitor, “the greatest athlete desires his opponent at his best” (Tao Te Ching), but it doesn’t sound like it’s there yet.

Then look at getting API access, with pay-as-you-go pricing and GPT-4’s 32k context window (~40 pages), and drool away. Cheaper than $35 too for most folks.
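To put “cheaper than $35” in perspective, here’s a quick back-of-envelope sketch. The per-token rates are the published mid-2023 prices for the gpt-4-32k model (they may change); the usage figures are illustrative assumptions, not anyone’s actual bill:

```python
# Rough monthly-cost estimate for pay-as-you-go gpt-4-32k.
# Rates below are the mid-2023 published prices and may change;
# the usage numbers are assumptions for illustration only.
INPUT_RATE = 0.06 / 1000   # dollars per input (prompt) token
OUTPUT_RATE = 0.12 / 1000  # dollars per output (completion) token

def monthly_cost(chats_per_day, input_tokens, output_tokens, days=30):
    """Estimate a month of API spend at the assumed rates."""
    per_chat = input_tokens * INPUT_RATE + output_tokens * OUTPUT_RATE
    return chats_per_day * per_chat * days

# e.g. 5 chats a day, ~1,000 tokens in and ~300 tokens out per chat:
print(round(monthly_cost(5, 1000, 300), 2))  # about $14.40/month
```

At that assumed volume you come in well under the $35 figure; heavy users with long 32k-context prompts can of course blow past it, which is why the max-spend cap on the account matters.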

I know 32k is in beta now, but when it becomes more widespread, this is what you should seek.

This is wonderful, and I’m really happy for you!

I don’t think I’m making myself clear. Let me take another stab.

Plugins lure users deeper into ChatGPT to experience AI at a level that probably feels empowering and euphoric. I sort of sense that from your exuberance. And for some users, this could be very meaningful, as it is for you. I do not discount that benefit, and I encourage the use and celebration of plugins for everyone where the following business drivers do not matter.

  1. Brand matters. Businesses know that customers care about brands that serve as signals for quality and trust.
  2. Security and privacy matter. Despite all the security measures that may be possible with GPT Plugins, the perception of security and privacy will be difficult to convey.
  3. Entities matter. Customers don’t telegraph this requirement, but in matters of information, they care deeply about who [exactly] they are getting advice from.

If none of these requirements matter, plugins are a huge win. But if any of them matter even a little, plugins may ultimately represent a cul-de-sac.

In addition to the three points noted, there’s a fourth unstated expectation -

Fluidity - workers tire quickly of copy-paste. In this decade, they expect near-perfect integration. They have also signaled, without question, that computing systems must meet the minimum performance of the Doherty Threshold (response times under about 400 ms).

The tax that plugins impose on users is the disconnectedness they may experience from their usual and customary workflows, or from the environments they are required to use for work or prefer for personal computing. Let’s face a simple reality: many of the prompts we build are based on texts already typed in another application. Many of the outputs from our prompts need to exist in the apps we already use. And most important, much of the data that transforms AGI into its greatest value exists in the apps we use every day.

The tax is not the $20 fee. It’s the disjointed actions necessary to enjoy AGI and utilize its massive benefits. I’ve been told that approximately 250 million copy-paste actions occur each day in OpenAI’s web apps. How long will workers continue to do this when there are deeply integrated AGI features where they work and play?

I’m no expert concerning the future, but unless plugins become good enough to compel companies and their workforces to spend a lot more time in ChatGPT and plugins become so powerful that they begin to serve as workplace proxies, I can’t see a clear path that makes plugins a ubiquitous part of computing.

There is a path that I can see - integrated AGI functionality living inside pretty much everything from your note-taking app to your OS. I believe this is where we will perform the vast inferencing the world will do over the next fifty years. And it will become mainstream, and it won’t be called AI.


Oh, well sir, to tell you the truth all of the copying and pasting right now is absolutely ridiculous. I’ve been trying to figure out a way around it since day 1 using ChatGPT (2.5 weeks ago, now). Fluidity is everything, and it was what I was initially using plugins to achieve.

For example, imagine trying to get ChatGPT to analyze a bunch of tabular data. I was facing a major hurdle with that character limit in the chat prompt. Until, boom, the Zapier plugin, speaking of quality and trust. I was impressed to see such a “name brand” in the plugin store, by the by, and immediately tried it. (Pretty dope. For example, it can read up to 500 rows of a given Google Sheet. It really helped out some analysis.) Though you are otherwise right, seeing a few more “name brands” in the store would build a bit more trust; it’s also bleeding-edge technology. We’ll see some bigger brands in there soon.

I can see from your top-level view that plugins may create disconnectedness. However, the very scope of your view may keep you from seeing how the folks down in the trenches can’t see so far. What I mean is, even though plugins create a limited environment versus the vastness available, I think that very vastness is daunting to the average person, and plugins can create a “training wheels” type of feel. “Here, some extra-intelligent help making your travel arrangements… and oh, yeah, Zapier.” It makes it a bit easier to digest while people get their heads around the fact that it’s an incredible time to be alive.

While I agree we’re seeing an early-stage free-for-all (before big brands swoop in and buy the lil’ guys out), I think that will change in a matter of months. There’s already a Zillow app. Meanwhile, the “popular” section of the plugin store seems pretty stable, by which I mean the apps on there are consistent.

I didn’t start to trust ChatGPT until I saw some of the prompts possible… and to tell you the truth, I enjoy typing all of my prompts live, except the ones I use all the time.

This all leads back to the problem of fluidity. Those wasted seconds copying and pasting are a big deal. Especially when it comes to AI and context. Like I said, I think people and companies will be willing to pay larger subscription levels if it means having access to an “infinite conversation,” or a version of ChatGPT with a long memory.

I don’t know if you’ve noticed, but ChatGPT Plus was just given a “chat sharing” feature. Naturally, what I would really like is a way to share chats within ChatGPT itself without having to leave the app. Its short-term memory loss is a misleading factor that is going to cause a business uproar if it isn’t handled directly. Anyway, you can get a little bit of what I am looking to do by combining the new feature with the Link Reader plugin, which is a marvelous little tool… the main strength/weakness of which is that it can only read the public internet, and only so much of it. So even if you share a ChatGPT chat with itself, it will only get so much of the conversation.

These plugins are a great way to see an innovation work in the live market before making it a part of the, uh, how do you say, ChatGPT interface… well, if plugins could change the interface.

I think a few of those key plugins are already operating at the power level you’re talking about. It might be Level 999, friend. I just think that the best ones are the simple ones that give Chat a bit more access to itself and the internet, and meanwhile it’s making the vastness of one of the first super intelligences more accessible to the average person.

I think Bill Gates recently said the winner of the AI race would be the one to produce a personal assistant, living in everything, like you said. And that certainly makes sense to me; but that level of AI symbiosis with daily life must be gradual… and it would also help if said AI could remember what it was talking to you about for more than three days at a time.


Yay! Well, I am interested in that, but the API is a bit outside of my abilities right now, though I’d like to learn more. (I am really only talking about a point-and-click solution.) Thank you very much for pointing me in the right direction. Do you have any other thoughts on the matter?


With API access, you get point-and-click “chat” through the Playground, so you don’t need to be a programmer to use the API. This is something often overlooked. You basically get “unlimited” chats (limited only by the max spend you set on your account), and no hourly caps. The responses are streamed back in real time, so you don’t wait through lag on the request.

You can also influence the tone of the chat using the System message. Or just leave it blank for general responses.
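For anyone wondering what the Playground is actually sending under the hood: the request is just a list of role-tagged messages, with the optional system message first. Here’s a minimal sketch of that payload; the helper function name is my own illustration, and no network call is made:

```python
# Builds a chat-completion style payload like the Playground does.
# A system message, when present, goes first and sets the tone;
# stream=True asks for tokens to be sent back as they are generated.

def build_chat_request(user_text, system_text=None, model="gpt-4", stream=True):
    """Assemble the request body (helper name is illustrative)."""
    messages = []
    if system_text:  # leave blank/None for general-purpose responses
        messages.append({"role": "system", "content": system_text})
    messages.append({"role": "user", "content": user_text})
    return {"model": model, "messages": messages, "stream": stream}

payload = build_chat_request(
    "Summarize this thread in two sentences.",
    system_text="You are a terse technical editor.",
)
```

Sent to the chat completions endpoint, a payload like this streams tokens back as they’re generated, which is why the Playground feels so responsive.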

Screenshot of Playground, for those without API access:


Those are great comments and all very valid points.

Indeed. Nothing wrong with crawl, walk, run - it’s a perfect transitional progression.

Many exceptions make sense for plugins. And I think brands will eventually swarm into the store like they piled onto Google Search ads. But, will brands entrust the entire customer activity to another brand (i.e., OpenAI)?


I’m so sorry. It’s taken me time to admit that this article was half AI-generated. Though I do have knowledge about these topics, I wrote it, but I wanted AI to make it more “perfect” and more formal. I’m so sorry.

Oh, yes, of course they will, and already are. That’s Capitalism, right there. Not the consumeristic BS that builds plastic continents in the sea; but the real kind that is built on good ol’ fashioned gumption and integrity.

ChatGPT is already a “household name,” that timeless goal of marketers everywhere. This whole thread has been about the ChatGPT brand vs. the Google Bard brand. ChatGPT is new but is earning trust through (apparent, and one hopes, actual) openness. Nevertheless, a rivalry the likes of Coke and Pepsi has been born.

I think wide adoption is already happening. The speed at which it is happening is unprecedented. I also think the number of companies working with the API as @curt.kennedy has suggested is understated. They’ll be more public about it once the “AIpocalypse” fears simmer down.

I don’t see the connection here.

I agree, but the wide adoption is not attributable to plugins. Most people are not satisfied with plugins. Even a recent interview with Sam Altman indicated that plugins aren’t performing well or, at the very least, haven’t achieved product-market fit (PMF).

He suggested that a lot of people thought they wanted their apps to be inside ChatGPT but what they really wanted was ChatGPT in their apps.

ChatGPT, and OpenAI by extension, has become more and more closed. They have not been earning any brownie points for being open. Ironically, OpenAI has requested that the above article be removed, which is why I used an archive link.

I think your points are very fair, though. Plugins in theory are a great step forward. But as a developer, plugins are very limiting. As a company, plugins strip away a lot of the benefits of having an application. There are some great plugins, but the way they are being marketed is unreasonable.

I thought the article was good. Probably the closest thing to a roadmap I’ve seen, backed with reasoning and logic. Not sure what was in the article that OpenAI wanted removed.

Agreed. I really liked seeing that there was SOME sort of idea of what’s going on so I can properly build my products to match. Everything that was mentioned in that article was wonderful.

Even just understanding their philosophy moving forward would help me tremendously. Because of this lack of communication, lack of any sort of guidance, and this hyperfocus on plugins, I have been moving to other products.


Trying to be objective here: when compared to their peers, OpenAI has the same communication pattern as most other tech companies in AI.

You get big announcements (mostly out of the blue), etc. But comparing on metrics, thinking Google vs. OpenAI, here is what I see…

Both have blogs:

OpenAI Blog
Google AI Blog

Both have research to freely download:

OpenAI Research
Google AI Research

But clicking around, it’s obvious that Google is mostly a giant research farm for the AI community. They have way more research than anyone, and their blog points to their research!

OpenAI, on the other hand, took that research, years ago, and started building products! I can appreciate that! I have no idea why Google sat back for so many years on AI, especially since they were in command of the research, and have so many resources to develop these large AI projects.

Besides the “Big Tech” players, you also have open source. According to the current Hugging Face Open LLM Leaderboard, the top model at the time of this writing is falcon-40b-instruct. The models are ranked on a variety of tasks, but I’m guessing GPT-4 is still king (it isn’t listed since it’s not open source). But here, still no research; the model card says “Paper coming soon :blush:”.

But there is another elephant in the room when it comes to LLMs and transparency: even if you downloaded the whole thing, could you run it? Currently, these large models take huge specialized machines to run.

I saw in the link above that Sam Altman was thinking of open-sourcing GPT-3, but then had second thoughts about whether anyone could run it. I’m still in favor of open-sourcing it. We saw that when Facebook open-sourced LLaMA, we got a C++ version that can run on laptops (llama.cpp)!

Anyway, rant over. </rant>
