I think the major one I have faced so far is plugin submission. People in a different time zone can't fix a plugin error right away; they have to wait one or two days for the review.
Another is that when you hit an error or bug, it's difficult to tell whether it comes from the developer side or from OpenAI. For example, some days ago I faced an unverified plugin that was taking forever to respond.
But it's not all bad; I believe this is the next phase of development, and we don't get diamonds without handling pressure.
How do you test your plugin before submitting it? I’m particularly curious when devs have a plugin in production but they want to develop v2 of the plugin. Are you creating different endpoints and then hosting v2 versions of the manifest file somewhere?
So if I understand correctly
- You have a remote server with the API endpoints
- You have the manifest and spec files hosted remotely
- You use curl to test the endpoints and to make sure that the manifest and spec files are being returned correctly
- You then open the dev tools to look at what is going on as you test out through the ChatGPT UI
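The curl step in that workflow can also be scripted. Here is a small Python sketch of the same checks; the base URL and the required manifest keys are illustrative assumptions, so adjust them to wherever your dev server actually listens:

```python
# Smoke-check a locally hosted plugin: fetch the manifest and the
# OpenAPI spec, and sanity-check the manifest content.
# BASE is an assumption -- match it to your server.
import json
import urllib.request

BASE = "http://localhost:5003"

# Core manifest fields (an assumed minimal subset).
REQUIRED_MANIFEST_KEYS = {"schema_version", "name_for_model", "api"}

def manifest_ok(text):
    """Pure check: manifest parses as JSON and has the core keys."""
    try:
        data = json.loads(text)
    except ValueError:
        return False
    return REQUIRED_MANIFEST_KEYS <= data.keys()

def check_served_files():
    """Fetch both files from the running dev server and validate them."""
    with urllib.request.urlopen(BASE + "/.well-known/ai-plugin.json") as r:
        assert r.status == 200 and manifest_ok(r.read().decode())
    with urllib.request.urlopen(BASE + "/openapi.yaml") as r:
        assert r.status == 200 and r.read().strip()  # spec is non-empty
```

Running `check_served_files()` against the dev server gives the same confidence as the curl round trip, and the pure `manifest_ok` check can run without any server at all.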
Do you use the local plugin for dev or testing? When you are developing a new version, how do you deploy and test without touching the production version that users are using?
You have a remote server with the API endpoints
Yes, it is started as an HTTP server from an SWI-Prolog command prompt and runs on the same machine for developing the plugin.
Welcome to SWI-Prolog (threaded, 64 bits, version 9.1.8)
?- working_directory(_,'C:/Users/Groot/SWI-Prolog - ChatGPT plugin (localhost)').
?- ['ChatGPT plugin Todo'].
% Started server at http://localhost:5003/
You have the manifest and spec files hosted remotely
No, they are on the same machine for development. Development is done using VSCode on Windows and the SWI-Prolog toplevel (think REPL).
You use curl to test the endpoints and to make sure that the manifest and spec files are being returned correctly
You then open the dev tools to look at what is going on as you test out through the ChatGPT UI
Yes. This only becomes useful once the plugin server and ChatGPT start communicating. (Just before writing this, I was able to get ChatGPT to load the openapi.yaml without errors.) The dev tools validated the files and did not like the use of - in certain fields. The - was part of the SWI-Prolog names, so I changed it to _ for now.
Chrome's developer tools (F12) are also open the entire time, and caching is disabled.
I don’t have any production versions at the moment. That may happen down the road, but I will need to shop the idea around and see if it floats. For a hint of an SWI-Prolog server running full time online, see SWISH.
Cool, thanks for the reply. I’ve done Prolog programming in the not-too-distant past, if you need any testers let me know.
Thanks, I’ll keep that in mind.
If you want to hang out with those using SWI-Prolog, just join the SWI-Prolog Discourse forum; it's free and there are never any ads, imagine that.
I have both a dev and prod version deployed. So I can update the dev version and install it as an unverified plugin. If that works then I can deploy to production.
I also have a set of simple integration tests that run through a set of calls against the API to check it functions.
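A set of integration tests along those lines can be quite short. The sketch below is not the poster's actual test suite; to keep it self-contained, a tiny in-process stub stands in for the real plugin API, and the endpoint paths and payloads are made up. Point BASE at a real dev deployment instead and drop the stub:

```python
# Integration-test sketch: run one round trip of calls against a
# plugin-style JSON API and check that it functions.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class StubTodoAPI(BaseHTTPRequestHandler):
    """Stand-in for a real plugin server (hypothetical /todos API)."""
    todos = []

    def _reply(self, code, obj):
        body = json.dumps(obj).encode()
        self.send_response(code)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def do_GET(self):
        self._reply(200, StubTodoAPI.todos)

    def do_POST(self):
        n = int(self.headers["Content-Length"])
        StubTodoAPI.todos.append(json.loads(self.rfile.read(n)))
        self._reply(201, {"ok": True})

    def log_message(self, *args):  # keep test output quiet
        pass

server = HTTPServer(("127.0.0.1", 0), StubTodoAPI)
threading.Thread(target=server.serve_forever, daemon=True).start()
BASE = f"http://127.0.0.1:{server.server_address[1]}"

def call(method, path, payload=None):
    """Issue one JSON request and return (status, decoded body)."""
    data = json.dumps(payload).encode() if payload is not None else None
    req = urllib.request.Request(BASE + path, data=data, method=method,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req, timeout=5) as resp:
        return resp.status, json.loads(resp.read())

# The integration tests: create an item, then read it back.
status, _ = call("POST", "/todos", {"title": "write tests"})
assert status == 201
status, todos = call("GET", "/todos")
assert status == 200 and todos == [{"title": "write tests"}]
```

Running the same `call`-based assertions against the unverified dev plugin first, and against production only after they pass, matches the dev/prod flow described above.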
I am a Kurd and it is difficult for me to understand your text. Please help us (the Kurdish people) with our language.
We are 50-70 million people and the world is very hard for us.
However, I have difficulty signing in to ChatGPT.
Since Chrome seems to be the only browser that always works correctly, are you using Chrome’s developer tools by pressing F12?
Are you also using Open plugin devtools?
Also, clicking on the drop-down errors with their feedback messages reveals useful information.
Starting the HTTP server in debug mode helps. Can’t be more specific because each server and programming language have different means/options.
Open plugin devtools will create a panel on the right of the ChatGPT conversation. It also includes a Refresh plugin button.
Since Chrome seems to be the only browser that always works correctly, are you using Chrome’s developer tools by pressing F12?
Yes, I investigated the network traffic and read the OpenAI API responses directly. It didn’t give me any more detail about my issues.
Are you also using Open plugin devtools?
No, I hadn’t noticed it! It looks great: it provides additional information and the ability to refresh the plugin. That’s what I wanted, thank you.
Side note: I feel it should be enabled by default every time someone creates their own plugin. It’s too easy to miss the feature right now.
Actually, if you’re using www.pluginlab.ai you can just create a second plugin and we provide you a new plugin domain you can develop with.
Our clients often have a dev and a production plugin on their account.
Plugin Developer here
It’s really easy to create and develop these plugins, once you realize that you just have to read the documentation. They are just APIs, that’s it.
Most of my web services were already APIs, so you just have to provide the ai-plugin.json (the “reason” for the plugin) and some sort of openapi.yaml (the nice thing is, you decide what capabilities the plugin has by excluding and including endpoints).
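For reference, a minimal ai-plugin.json looks roughly like the sketch below. The field names are the documented manifest fields; the values (plugin name, URLs, email) are purely illustrative placeholders, not taken from any post in this thread:

```json
{
  "schema_version": "v1",
  "name_for_human": "Todo Plugin",
  "name_for_model": "todo",
  "description_for_human": "Manage a simple todo list.",
  "description_for_model": "Plugin for adding, listing, and deleting todo items.",
  "auth": { "type": "none" },
  "api": {
    "type": "openapi",
    "url": "http://localhost:5003/openapi.yaml"
  },
  "logo_url": "http://localhost:5003/logo.png",
  "contact_email": "dev@example.com",
  "legal_info_url": "http://example.com/legal"
}
```

The description_for_model field is the interesting one: it is what the model reads to decide when and how to call your endpoints.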
Setting CORS up properly is standard with RESTful APIs.
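As a concrete illustration of that CORS setup, here is a minimal sketch using only the Python standard library. The allowed origin for ChatGPT plugins is https://chat.openai.com; everything else here (handler name, response body) is invented for the example:

```python
# Minimal HTTP server that answers CORS preflight (OPTIONS) and GET
# requests with the headers ChatGPT's browser UI needs to see.
import http.server
import threading
import urllib.request

ALLOWED_ORIGIN = "https://chat.openai.com"  # ChatGPT's origin

class CORSHandler(http.server.BaseHTTPRequestHandler):
    def _cors_headers(self):
        self.send_header("Access-Control-Allow-Origin", ALLOWED_ORIGIN)
        self.send_header("Access-Control-Allow-Methods", "GET, POST, OPTIONS")
        self.send_header("Access-Control-Allow-Headers",
                         "Content-Type, Authorization")

    def do_OPTIONS(self):  # the browser's preflight request
        self.send_response(204)
        self._cors_headers()
        self.end_headers()

    def do_GET(self):
        body = b'{"status": "ok"}'
        self.send_response(200)
        self._cors_headers()
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the example quiet
        pass

# Bind to an ephemeral port and serve in the background.
server = http.server.HTTPServer(("127.0.0.1", 0), CORSHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]
```

If the Access-Control-Allow-Origin header is missing or wrong, the plugin's requests fail silently in the browser, which is exactly the kind of thing the devtools network tab surfaces.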
You can even set up the authentication granularly, which is pretty nice.
The Retrieval Plugin is a God's gift!
Thank you, OpenAI, for contributing to the open source community in such a big way.
I put the plugin files up on GitHub
As you can see I don’t do much with GitHub repos so hopefully everything you need is there. Let me know if you have problems getting it to run and I should be able to help.
The current design of the plugin is solely for compiling Prolog code using SWI-Prolog, and it does not include a feature to run the code. However, there are plans to potentially add this functionality in the future. It’s important to note that running Prolog code outside of a sandbox environment on an HTTP server can pose significant risks to the system running the HTTP server.
On the other hand, Prolog code that is restricted to run only within a sandbox can often be so limited that it hampers the development of production-grade code. This is because the plugin would frequently fail to compile such code, as it would not pass the sandbox checks during the compilation process.
Nevertheless, if the plugin is installed and operated by someone using localhost, running all Prolog programs should be permissible. This is because the risks associated with running Prolog code outside of a sandbox are significantly limited to just the localhost environment.
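One way such a compile-only check can be wired up is to shell out to swipl and treat a clean load as success. This is a sketch, not the plugin's actual code; the --on-error=halt flag is available in recent SWI-Prolog releases (older ones need a different error check), and the file name is hypothetical:

```python
# Sketch: validate Prolog source by asking SWI-Prolog to load it and
# halt, without ever running the program's goals.
import subprocess

def compile_check_command(prolog_file):
    """Build the swipl invocation that loads a file and immediately halts.
    A load error makes swipl exit non-zero, so compilation success can
    be read from the return code alone."""
    # -q: suppress the banner; -g halt: exit after a successful load;
    # --on-error=halt: make load-time errors fatal (recent SWI-Prolog).
    return ["swipl", "-q", "--on-error=halt", "-g", "halt", prolog_file]

def compiles(prolog_file):
    """Return True if SWI-Prolog loads the file without errors.
    Requires swipl to be on PATH; never executes user goals."""
    result = subprocess.run(compile_check_command(prolog_file),
                            capture_output=True, text=True, timeout=30)
    return result.returncode == 0
```

Because the file is only loaded, not run, this sidesteps most of the sandbox question: directives in the file still execute at load time, so even this should only be exposed on localhost, as noted above.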
Are you still working on #3? I’d be happy to help out.
I created it as a prototype to see if it would work, and it does. I don’t even use it anymore; ChatGPT does what I need.
Do you talk to GPT from swipl, and if so, do you use prolog2gpt or something similar?
The code is on GitHub.
Don’t expect much, it was simply a proof of concept.
A ChatGPT plugin that will validate Prolog code generated by ChatGPT by compiling the code with SWI-Prolog.
I have not used it in months and don’t even know if it will still work with changes to ChatGPT plugin requirements.
Thank you – interesting. But I meant independently of your plugin, since you work in Prolog (I envy you that), do you talk to GPT from swipl? Or do you just use GPT to help write Prolog code?