A neat YouTube video with David Orban and Stephen Wolfram that just came out.
ChatGPT, AI, and AGI with Stephen Wolfram https://www.youtube.com/live/szxiPMyuMGY via @davidorban @stephen_wolfram
Guys, it has been super exhausting, but look at the opportunities!
99% of people are ignorant of the “API/plugin/OpenAI power”.
We don’t need to invent Google to be successful; focus on a use case and commit to it. There’s no sense in chasing tails or trying to create the perfect product.
Still, some public roadmap would be excellent.
Would there be a way to make a plugin template with Stripe/Venmo/Cash App/PayPal/Apple Pay prebuilt on Vercel, with the OpenAI API already set up?
This would let developers charge pay-per-use for extremely intricate and helpful plugins. Apps are going to come and go, so let the devs get paid and incentivized to make more stuff; a true template that helps fund the dev lets people go straight to sophisticated uses, which will accelerate the kinds of things we build.
Pay-per-use minimizes up-front cost for users who may hesitate to buy credits or sign up for a monthly subscription. More use = faster evolution toward seeing what these multi-prompt flows and new forms of access can really do.
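The metering half of that template idea could be sketched in a few lines. Everything below is hypothetical (the class name, the pricing, the user id) - it only illustrates the per-call tally a template would hand off to Stripe/PayPal/etc. at invoicing time, not any real provider’s API:

```python
from dataclasses import dataclass, field

@dataclass
class UsageMeter:
    """Tracks per-user plugin calls so they can be billed pay-per-use."""
    price_per_call_cents: int
    calls: dict = field(default_factory=dict)  # user_id -> call count

    def record_call(self, user_id: str) -> None:
        self.calls[user_id] = self.calls.get(user_id, 0) + 1

    def amount_due_cents(self, user_id: str) -> int:
        # A real template would send this figure to the payment provider
        # (e.g., as an invoice item) at the end of the billing period.
        return self.calls.get(user_id, 0) * self.price_per_call_cents

meter = UsageMeter(price_per_call_cents=5)
for _ in range(3):
    meter.record_call("user-123")
print(meter.amount_due_cents("user-123"))  # → 15
```

Wiring this to an actual processor is then one API call per billing period with the computed amount, which keeps the payment logic out of the plugin’s hot path.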
Thank you so much for sharing; we will wait.
The evidence shows there are issues with building suitable and successful systems, but it’s also showing it’s not absolutely useless. Ironically, nothing about AI is absolute.
I feel ya, man.
This risk is not new. Since the early ’80s, the debate over aftermarket providers encroaching on [seemingly] “sacred” territory has raged on. Companies insensitive to the delicate balance and advantages of having partners soon find themselves without partners. And aftermarket providers aloof to free-market forces soon find themselves without customers. It requires great skill and agility to thrive in the aftermarket of any industry.
That’s the skill part - you need to forecast probable moves by your competitors and infrastructure providers. For any given idea or approach, you must weigh the likelihood that your disruptive or cool idea will be displaced by others who might do it better. You are not competing only with the infrastructure providers; peers in your aftermarket want to eat your lunch as well.
Known unknowns are rife in any segment where magical advancements are being made at an increasing pace. This was true in 1980 when Jobs and Wozniak were changing the landscape. It was a climate of vastly unknown outcomes. Today is no different.
During those years, I placed a bet that no single communications protocol would emerge soon that worked with all personal computers. I built LapLink, which thrived for almost a decade until Ethernet killed it. Sure, it still exists today, but not in its original spirit. We helped ~250 million users move information between desktops and laptops, and then it was wholly replaced by something better.
Today, I’m putting a lot of trust in one very simple idea - the compression of time and the skilled effort required to bind the power of LLMs with deeply specific information about work. Big players will make similar investments, and one (or many) may end my investment overnight. However, I’m betting big players won’t see what I see - at least not initially for the “S” in SMB.
A good example is the segment being explored by ChatPDF and CustomGPT. I’m putting a lot of effort into one of these players because only one has a vision that aligns with my own. But I’m also clear-eyed about this investment. It could end in ashes because I’m actually an aftermarket supplier two levels of abstraction from the infrastructure providers - i.e., very high risk indeed.
LLMs are designed like the blood-brain barrier in most creatures; they are isolated because trained models are based on a mix of lots of energy plus lots of silicon. They take massive amounts of power and assimilation to become intelligent. This architecture will exist for a while. As such, the only reasonable pathway to create dynamic interaction between LLMs and real-time data is by interfacing with external information systems.
Plugins are that interface, but building plugins in a no-code (LLM) fashion is very different from building the pipeline’s last mile. Something has to glue the circulatory system to the brain. It’s not surprising to see Zapier on the short list of partners; it has the integration adhesive to about five thousand endpoints, and OpenAI needs those endpoints. OpenAI can certainly train its LLMs to build apps to those endpoints over time, but an additional element is needed - business logic.
This is where you, me, and millions of developers and domain experts have a slight advantage concerning plugins.
Indeed. And without this symbiotic collaboration, plugins would not work well. This is where aftermarket opportunities exist and will likely exist for about a decade. This real-world → LLM-world context-building requirement is critical to making it all work.
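For anyone who hasn’t looked yet, the interface being discussed is small: a plugin is essentially a manifest plus an OpenAPI spec describing your endpoints, and the model reads the descriptions to decide when to call you. A rough sketch of the manifest’s shape (field names as I recall them from the plugin docs - verify against the official reference; the plugin itself is made up):

```python
import json

# Hypothetical "Acme Inventory" plugin; only the overall shape matters here.
manifest = {
    "schema_version": "v1",
    "name_for_human": "Acme Inventory",
    "name_for_model": "acme_inventory",
    "description_for_human": "Check Acme stock levels in real time.",
    # The model reads this field to decide when the plugin is relevant -
    # this is where the "business logic" context-building happens.
    "description_for_model": "Look up live stock levels for Acme SKUs.",
    "auth": {"type": "none"},
    "api": {"type": "openapi", "url": "https://example.com/openapi.yaml"},
}
print(json.dumps(manifest, indent=2))
```

Note that the two `description_for_model` lines are the entire “glue”: the LLM never sees your code, only these descriptions and the OpenAPI spec they point to.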
You’re right
The morning of the release, I was just finishing an app using embeddings; within a few hours it became completely obsolete with the announcement.
There certainly is a lot of confusion around the term “developer” in the announcement. To make it clearer, they should have said SaaS developers who are interested in exposing their software to an LLM front end - in particular, OpenAI’s ChatGPT implementation - as distinct from the other “developers”, not necessarily SaaS, who access the gpt-3.5-turbo or gpt-4 models directly.
But it makes sense, and is basically a fractal of what Zapier did … attract SaaS companies to a low/no-code platform in hopes of more SaaS revenue $$$$. So this business model works!
I’m interested to see where this goes. One of the big hurdles for fine-tunes was a lack of trustworthy seq-to-seq instructions (Words → API calls). So if they can nail this seq-to-seq problem with plugins, this is a huge leap forward.
But as a non-SaaS developer, if they could expose the core model that translates Words → API instructions, this would be a lot bigger deal for me, since this is the core technology offering in the announcement.
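To make the “Words → API calls” framing concrete, here is a toy stand-in: the target of that seq-to-seq translation is just a structured call. The stub below pattern-matches where a real system would have the model emit the JSON, and the endpoint and fields are invented:

```python
def words_to_call(utterance: str) -> dict:
    """Toy stand-in for the LLM that turns free text into an API call."""
    words = utterance.rstrip("?.").split()
    # A real system would have the model produce this structure; the
    # crude string matching here only shows the target representation.
    if "weather" in (w.lower() for w in words) and "in" in words:
        city = words[words.index("in") + 1]
        return {"method": "GET", "path": "/v1/weather", "params": {"city": city}}
    return {}  # no recognized intent

print(words_to_call("What is the weather in Paris?"))
# → {'method': 'GET', 'path': '/v1/weather', 'params': {'city': 'Paris'}}
```

The hard part the post describes is exactly the gap between this toy and a model that emits such calls reliably for arbitrary phrasing.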
Very good points, thank you for your responses.
This has been quite the ride. Looking forward to the future.
For sure. You see that already with Google and Microsoft incorporating GPT powers into their office suites, completely wrecking hundreds of early-entry apps and extensions people came up with.
But it was as predictable as it is necessary.
Small startups and devs have to stop jumping on the obvious use cases, because the big players have both the access and the resources to cover those themselves much better than an individual dev can do at home after their day job.
Instead, they need to look for niche and proprietary applications of the technology. E.g., don’t create a generic chatbot, for Christ’s sake! Instead, create a conversational interface to a specific company’s internal knowledge base or its processes and documentation.
Or hook into a specific special-use-case business application and automate something in there with a plugin or a connection to a text model of some kind… something that MS or Google won’t railroad you on within a few weeks.
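That knowledge-base idea is mostly retrieval plumbing. A minimal sketch, with a bag-of-words vector standing in where a real system would use an embedding model, and the documents invented for illustration:

```python
from collections import Counter
from math import sqrt

def embed(text: str) -> Counter:
    # Stand-in for a real embedding model: a bag-of-words token count.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Toy "internal knowledge base": doc id -> text.
docs = {
    "vacation-policy": "employees accrue vacation days each month",
    "expense-policy": "submit expense reports within thirty days",
}

query = "how do vacation days accrue"
best = max(docs, key=lambda k: cosine(embed(query), embed(docs[k])))
print(best)  # → vacation-policy
```

The retrieved document would then be pasted into the prompt as context - the niche value is in curating the documents and the domain logic around them, not in this loop.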
Absolutely!
I’ve been using drinkwater.ai for some weeks, and this API feature has been there from the start.
And the knowledge base with semantic search, too.
Not only that, but it’s also the case with these gradual rollouts … I can’t incorporate even the newest features into my own development, such as GPT-4 API access, which I still haven’t received although I am on the waitlist.
So other insiders not only get knowledge of the roadmap you mentioned, but also get access before the release and for a long time after it … and that is a serious disadvantage to the rest of the community.
I must admit that with the plugins release they killed off some of my ideas and some of the work I am doing… but it was an obvious market move for them to make.
As someone mentioned, commit to a use case, focus on solving it, and adapt - but at least a roadmap would be cool. Right now I’m interested in whether, e.g., these plugins will be accessible over the API and not only through ChatGPT. Info like that would be helpful at least…
If it was obvious, maybe that’s not where you should have placed a bet.
Sometimes we do the obvious for different reasons:
If you can think it up, 100 other people thought it up the week before you did. And at least one of the one hundred is the core infrastructure provider.
No business is obligated to expose its strategic plans.
They won’t. The entire purpose of plugins is to enrich GPT services, not the other way around. Creating plugins to serve as endpoints to your enrichments is not their objective. Rather, creating better solutions that utilize their API is precisely their objective.
Can y’all OpenAI folks give any indication how quickly people from the waitlist are being added? Is it going at a good clip or more of a slow roll?
Can you clarify that? I am not sure I understand the logic. If for example Zapier creates a plugin to extend the capability of ChatGPT, do you think the Zapier integration will be available only within ChatGPT or also in an API?
Zapier will still live, and ChatGPT will use Zapier. But API developers will use the raw ingredients we’ve always been afforded and now SaaS companies can get in the mix directly to OpenAI without Zapier or other “SaaS Hubs”.
The more the merrier!
That’s already the case.
Nope. It’s already available in five thousand other apps and services.
Zapier is like an adhesive; it’s designed to glue stuff together. Lots of stuff. And they’ve been invited behind the velvet rope to run seamlessly inside ChatGPT. Every recipe in the Zapier ecosystem will be accessible through plugins that will blossom in abundance because they will require no code. As a result, Zapier will be instrumental in breaking the bounds of static LLMs by providing access to information that is not in the LLM.
Zapier has always been accessible through its API. I anticipate OpenAI’s API will provide plugin methods, but not specifically anything related to Zapier itself.
I am blocked from ChatGPT because I purchased ChatGPT Plus, which blocks you from ChatGPT.
This is what ChatGPT Plugins are. It’s a way for Software-as-a-Service (SaaS) companies to integrate their services with OpenAI without Zapier or anything else. How did Zapier come about? The same way - the many SaaS companies developed their “plugins” for Zapier. It’s the same thing! The only difference is that the interface is through ChatGPT, not generic HTTP web pages.
Here is the development pipeline for a SaaS developer to build their offering into Zapier.
So AI is now “smart” enough to interpret your words, and create these button mashes and drop-downs for you.
Like I said, Words → API calls (ChatGPT Plugins), instead of Buttons/Dropdowns → API calls (in the case of Zapier). And in the case of API users/developers, it’s Lines_Of_Code → API calls.
So “software” is getting more and more abstracted and democratized (though ironically, emanating from a central point). Now, with LLMs, it is becoming much more natural as well. I wonder if, in 50 years, folks will look at Zapier and think it is assembly code. Software over the years has become so abstracted. This is the obvious trend.
And I know, I started coding with punch cards! (??? pre-Assembly ???)
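The three layers line up neatly if you write them out. A small sketch - the endpoint and payload are made up, and only the bottom layer is actual code:

```python
from urllib.request import Request

# Words layer (ChatGPT plugin):  "Add a row to my sheet with today's total."
# Buttons layer (Zapier):        a trigger + action configured in the editor.
# Lines-of-code layer (API dev): build the HTTP call directly, as below.
req = Request(
    "https://example.com/v1/rows",            # hypothetical endpoint
    data=b'{"total": 42}',
    headers={"Content-Type": "application/json"},
    method="POST",
)
print(req.get_method(), req.full_url)  # → POST https://example.com/v1/rows
```

All three layers bottom out in the same request; each rung up the ladder just removes who has to know the details in the `Request` constructor.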
Totally agree. On the LLM side, the abstractions will be far more productive. Zapier’s buttons and connectors often make convoluted Rube Goldberg machines look elegant.
But we need to accept the reality that there are now evolving (with the help of LLMs) two sides to every Zapier recipe: the one inside ChatGPT and the other end of the recipe. Zapier will still be the glue sandwich; on the LLM side, simple words will connect to the integration workflow, which must then connect to something that will still require discrete configurations, buttons, Zapier skills, etc.
Are you saying that GPT Plugins will essentially handle both ends of the integration recipe? If so, doesn’t that mean the days are numbered for Zapier altogether?
I’ve always felt uneasy about the middle layer that Zapier and Make provide. It increases attack surfaces, exposes your content and API keys to yet another SaaS provider, adds to the complexities of integration, and generally forces builders to push business logic into places it should probably not exist.
Elegant summation.
They will. And they will wonder how we got anything done.