What are the top 3 pain points for plugin developers?

I have both a dev and prod version deployed. So I can update the dev version and install it as an unverified plugin. If that works then I can deploy to production.

I also have a set of simple integration tests that run through a set of calls against the API to check it functions.
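A setup like that can be sketched with the standard library alone. Below, a tiny stub server stands in for the plugin API; the endpoint paths are assumptions for illustration, not the poster's actual routes:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Hypothetical stub standing in for the dev deployment of the plugin API.
class StubPluginAPI(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in ("/.well-known/ai-plugin.json", "/openapi.yaml", "/items"):
            body = json.dumps({"ok": True}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # silence per-request logging
        pass

def smoke_test(base_url, paths):
    """Hit each endpoint and collect (path, status) pairs."""
    results = []
    for path in paths:
        with urlopen(base_url + path) as resp:
            results.append((path, resp.status))
    return results

# Run the stub on an ephemeral port and check every endpoint answers 200.
server = HTTPServer(("127.0.0.1", 0), StubPluginAPI)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_port}"
checks = smoke_test(base, ["/.well-known/ai-plugin.json", "/openapi.yaml", "/items"])
server.shutdown()
```

Running the same `smoke_test` against the real dev URL before promoting to production is the idea; only the target base URL changes.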


Hi
I am a Kurd, and it is difficult for me to understand your text. Please help us (the Kurdish people) with our language.
We are 50–70 million people, and the world is very hard for us.
However, I also have difficulty signing in to ChatGPT.

  1. Lack of documented good practices on how to design services and APIs for plugins. I would like to read how to deal with problems that affect every plugin developer: overly large returned responses, correct API documentation.
  2. No detailed error information - you have to guess where the error is.
  3. I don’t know how to purge the openapi.yaml fetched by OpenAI.
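On point 1, a common workaround for overly large responses is trimming the payload to a byte budget before returning it. A minimal sketch; the budget value here is an illustrative guess, not a documented limit:

```python
import json

def fit_response(records, max_bytes=4000):
    """Drop trailing records until the serialized payload fits the budget.

    Assumption: records are ordered by relevance, so dropping from the
    end loses the least useful data. max_bytes is a made-up default.
    """
    kept = list(records)
    while kept and len(json.dumps({"results": kept, "truncated": False}).encode()) > max_bytes:
        kept.pop()
    truncated = len(kept) < len(records)
    return {"results": kept, "truncated": truncated}

# 50 records of ~200 characters each will not fit in 2000 bytes,
# so the helper keeps a prefix and flags the truncation.
payload = fit_response([{"id": i, "text": "x" * 200} for i in range(50)], max_bytes=2000)
```

The `truncated` flag lets the model tell the user that more results exist instead of silently returning a partial list.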

Since Chrome seems to be the only browser that always works correctly, are you using Chrome’s developer tools by pressing F12?

Are you also using Open plugin devtools?


Also, clicking on the drop-down errors with their feedback messages reveals useful information.

Starting the HTTP server in debug mode helps. I can’t be more specific because each server and programming language has different means/options.

Open plugin devtools will create a panel on the right of the ChatGPT conversation. It also includes a Refresh plugin button.



Since Chrome seems to be the only browser that always works correctly, are you using Chrome’s developer tools by pressing F12?

Yes, I investigated the network traffic and read the OpenAI API responses directly. It didn’t give me any more detail about my issues.

Are you also using Open plugin devtools?

No, I hadn’t noticed it! It looks great; it provides additional information and the ability to refresh the plugin. That’s what I wanted, thank you.
Side note: I feel it should be enabled by default whenever someone creates their own plugin. It’s too easy to miss the feature right now.


Actually, if you’re using www.pluginlab.ai you can just create a second plugin and we provide you a new plugin domain you can develop with.

Our clients often have a dev and production plugin on their account :slight_smile:

Plugin Developer here :+1:
It’s really easy to create and develop these plugins once you realize that you just have to read the documentation. They are just APIs, that’s it.
Most of my web services already were APIs, so you just have to provide the ai-plugin.json (i.e., the “reason” for the plugin) and some sort of openapi.yaml (the nice thing is, you decide what capabilities the plugin has by including or excluding endpoints).
Setting up CORS properly is standard with RESTful APIs.
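For readers who haven’t seen one, a minimal ai-plugin.json looks roughly like this; every value below is a placeholder, and optional fields are omitted:

```json
{
  "schema_version": "v1",
  "name_for_human": "Example Plugin",
  "name_for_model": "example_plugin",
  "description_for_human": "Short description shown to users.",
  "description_for_model": "Longer description the model uses to decide when to call the plugin.",
  "auth": { "type": "none" },
  "api": { "type": "openapi", "url": "https://example.com/openapi.yaml" },
  "logo_url": "https://example.com/logo.png",
  "contact_email": "dev@example.com",
  "legal_info_url": "https://example.com/legal"
}
```

The `description_for_model` field is the one worth iterating on, since it steers when ChatGPT actually invokes your endpoints.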
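A plugin backend has to answer the browser’s preflight requests. A minimal sketch using only the standard library, allowing the chat.openai.com origin (adjust the origin and headers for your own deployment):

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

# The origin the ChatGPT web UI calls plugins from.
ALLOWED_ORIGIN = "https://chat.openai.com"

class CorsHandler(BaseHTTPRequestHandler):
    def _cors_headers(self):
        self.send_header("Access-Control-Allow-Origin", ALLOWED_ORIGIN)
        self.send_header("Access-Control-Allow-Methods", "GET, POST, OPTIONS")
        self.send_header("Access-Control-Allow-Headers", "Content-Type, Authorization")

    def do_OPTIONS(self):  # preflight request
        self.send_response(204)
        self._cors_headers()
        self.end_headers()

    def do_GET(self):
        self.send_response(200)
        self._cors_headers()
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(b'{"ok": true}')

    def log_message(self, *args):  # keep output quiet
        pass

# Spin up the server and verify the preflight response carries the header.
server = HTTPServer(("127.0.0.1", 0), CorsHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/items"

req = Request(url, method="OPTIONS")
with urlopen(req) as resp:
    allow_origin = resp.headers["Access-Control-Allow-Origin"]
server.shutdown()
```

Most frameworks have middleware that does this for you; the point is just which headers must come back on the preflight.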

You can even set up the authentication granularly, which is pretty nice :wink:
The Retrieval Plugin is a godsend!

Thank you, OpenAI, for contributing to the open source community in such a big way :heart::heart:


I put the plugin files up on GitHub

As you can see, I don’t do much with GitHub repos, so hopefully everything you need is there. Let me know if you have problems getting it to run and I should be able to help.

The current design of the plugin is solely for compiling Prolog code using SWI-Prolog, and it does not include a feature to run the code. However, there are plans to potentially add this functionality in the future. It’s important to note that running Prolog code outside of a sandbox environment on an HTTP server can pose significant risks to the system running the HTTP server.

On the other hand, Prolog code that is restricted to run only within a sandbox can often be so limited that it hampers the development of production-grade code. This is because the plugin would frequently fail to compile such code, as it would not pass the sandbox checks during the compilation process.

Nevertheless, if the plugin is installed and operated by someone using localhost, running all Prolog programs should be permissible. This is because the risks associated with running Prolog code outside of a sandbox are significantly limited to just the localhost environment.
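For anyone wanting to reproduce the compile-only check locally, a hedged sketch that shells out to swipl (the exact flags and exit-code behavior vary across SWI-Prolog versions, so the sketch also inspects stderr rather than trusting the return code alone):

```python
import os
import shutil
import subprocess
import tempfile

def build_check_command(path):
    # -q: suppress the banner; -g halt: exit once the file has been
    # loaded/compiled. Assumption: compile errors are printed to stderr.
    return ["swipl", "-q", "-g", "halt", path]

def check_prolog(source):
    """Compile-check Prolog source; returns (ok, stderr).

    Returns (None, reason) when swipl is not on PATH, so callers can
    distinguish "broken code" from "no compiler available".
    """
    if shutil.which("swipl") is None:
        return None, "swipl not installed"
    with tempfile.NamedTemporaryFile("w", suffix=".pl", delete=False) as f:
        f.write(source)
        path = f.name
    try:
        proc = subprocess.run(build_check_command(path),
                              capture_output=True, text=True, timeout=10)
        ok = proc.returncode == 0 and not proc.stderr.strip()
        return ok, proc.stderr
    finally:
        os.unlink(path)
```

Because this only loads the file and halts, no user goal is ever executed, which is what keeps the compile-only design comparatively safe.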


Are you still working on #3? I’d be happy to help out.

Thanks.

No.

I created it as a prototype to see if it would work, and it does. I don’t even use it anymore; ChatGPT does what I need.

Got it.

Do you talk to gpt from swipl, and if so, do you use prolog2gpt or something similar?

The code is on GitHub.

Don’t expect much, it was simply a proof of concept.

A ChatGPT plugin that will validate Prolog code generated by ChatGPT by compiling the code with SWI-Prolog.

I have not used it in months and don’t even know if it will still work with changes to ChatGPT plugin requirements.

Thank you – interesting. But I meant independently of your plugin, since you work in Prolog (I envy you that), do you talk to GPT from swipl? Or do you just use GPT to help write Prolog code?