I began developing a plugin but shifted focus to creating a GPT for the OpenAI GPT store. Now, I’m considering adapting this GPT for use in Bing Chat and Copilot. Has anyone here experienced a similar transition? I’d love to hear about your journey, particularly any challenges or insights you encountered in making your plugin compatible with Bing Chat and Copilot.
That’s great to hear!
One thing to keep in mind is that you can’t really use a custom GPT outside of ChatGPT. So, if you want to make this transition, you would need to replicate it using the API. API development is a very active part of this forum, and it’s why so many folks end up wanting to go deeper and build their own tool like this.
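To give a rough idea, here’s a minimal sketch of what that replication might look like with the official openai Node.js SDK. The model name and system prompt are just placeholders standing in for your custom GPT’s instructions; file knowledge and Actions would need to be wired up separately.

```typescript
import OpenAI from 'openai';

// Assumes OPENAI_API_KEY is set in the environment.
const client = new OpenAI();

// The system message stands in for your custom GPT's instructions.
async function ask(question: string): Promise<string> {
  const response = await client.chat.completions.create({
    model: 'gpt-4o', // placeholder; use whichever model fits your use case
    messages: [
      { role: 'system', content: 'You are my custom assistant. <your GPT instructions here>' },
      { role: 'user', content: question },
    ],
  });
  return response.choices[0].message.content ?? '';
}

ask('Hello!').then(console.log);
```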
As @Macha points out, building a custom GPT within the ChatGPT environment and building a plugin/bot for Copilot are very different projects.
If you want to create the bot locally, you can utilize the Bot Framework SDK.
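For example, the usual local setup looks roughly like this. It’s a sketch based on the standard Node.js echo-bot samples (botbuilder + restify), not our exact code; the app ID/password environment variable names follow the common sample convention.

```typescript
import * as restify from 'restify';
import { ActivityHandler, BotFrameworkAdapter } from 'botbuilder';

// Credentials come from your Azure Bot registration; the env var names are a convention.
const adapter = new BotFrameworkAdapter({
  appId: process.env.MicrosoftAppId,
  appPassword: process.env.MicrosoftAppPassword,
});

// Minimal bot: echoes the user's message. This is where you would call
// your own API (or the OpenAI API) instead.
class EchoBot extends ActivityHandler {
  constructor() {
    super();
    this.onMessage(async (context, next) => {
      await context.sendActivity(`You said: ${context.activity.text}`);
      await next();
    });
  }
}

const bot = new EchoBot();
const server = restify.createServer();

server.post('/api/messages', (req, res) => {
  adapter.processActivity(req, res, async (context) => {
    await bot.run(context);
  });
});

server.listen(process.env.PORT || 3978, () => {
  console.log(`Bot listening at ${server.url}/api/messages`);
});
```

You can point the Bot Framework Emulator at http://localhost:3978/api/messages to test it locally before connecting it to an Azure Bot resource.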
For Copilot integration, my team uses the Azure Bot resource for development, testing, and publishing, and the Graph API to connect with Microsoft 365.
You can search the Microsoft documentation for specifics to determine if those options fit with your use case.
@BPS_Software & @Macha It’s helpful (and motivating) to know others have walked this path! I’ve integrated Actions into my GPT, connecting it with my API through a Swagger definition. With a Copilot plugin, is it possible to reuse the YAML directly?
Copilot, while it does use OpenAI’s models, is a Microsoft product, not an OpenAI one. So, just as an FYI, when it comes to answering questions about Copilot here, the knowledge is going to be a bit sparser.
I interpreted what you were asking as more “Should I make the jump to evolve my custom GPT into other stuff?”, which is a common question. On the specifics of Bing Chat or Copilot Chat plugins, I’m not sure. If I misunderstood, that’s my bad.
Yes, you can reuse the YAML from your Swagger/OpenAPI definition directly with a Copilot plugin, much as you did for your GPT’s Actions. The plugin essentially acts as an interface between the model and your external API.
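As a quick sanity check before packaging that same YAML for the plugin, you can validate it, for example with @apidevtools/swagger-parser. The file path here is a placeholder for wherever your spec lives.

```typescript
import SwaggerParser from '@apidevtools/swagger-parser';

// Validates the same OpenAPI/Swagger YAML you already point your GPT Actions at,
// so you know it's well-formed before reusing it on the Copilot side.
async function checkSpec(path: string): Promise<void> {
  const api = await SwaggerParser.validate(path);
  console.log(`"${api.info.title}" v${api.info.version} is valid.`);
  console.log('Paths exposed:', Object.keys(api.paths ?? {}));
}

checkSpec('./openapi.yaml').catch((err) => {
  console.error('Spec validation failed:', err.message);
  process.exit(1);
});
```

From what I’ve seen in the Microsoft docs, the plugin package wraps the spec in a manifest rather than requiring you to rewrite it, but check the current Copilot extensibility documentation for the exact packaging details.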