I think the major issue I have faced so far is plugin submission. People in different time zones can't rectify a plugin error right away; they have to wait one to two days for the review.
Another is that when facing an error or bug, it's difficult to know whether it comes from the developer's side or from OpenAI. For example, some days ago my unverified plugin was taking forever to respond.
But it's not all bad. I believe this is the next phase of development, and we don't get diamonds without handling pressure.
How do you test your plugin before submitting it? I’m particularly curious when devs have a plugin in production but they want to develop v2 of the plugin. Are you creating different endpoints and then hosting v2 versions of the manifest file somewhere?
You have the manifest and spec files hosted remotely
You use curl to test the endpoints and to make sure that the manifest and spec files are returning
You then open the dev tools to look at what is going on as you test out through the ChatGPT UI
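The curl step above amounts to checking that the two plugin files come back with the expected content. A minimal local sanity check of the manifest could look like this (the required-field list follows my reading of OpenAI's plugin docs, so treat it as an assumption):

```python
# Sketch of the "curl the manifest" check done in Python instead of curl.
# The required-field list is an assumption based on OpenAI's plugin docs.
import json

REQUIRED_MANIFEST_FIELDS = {
    "schema_version", "name_for_human", "name_for_model",
    "description_for_human", "description_for_model",
    "auth", "api", "logo_url", "contact_email", "legal_info_url",
}

def missing_manifest_fields(manifest: dict) -> set:
    """Return the required top-level fields absent from ai-plugin.json."""
    return REQUIRED_MANIFEST_FIELDS - manifest.keys()

# Example manifest such as one served at /.well-known/ai-plugin.json
# (all values below are hypothetical).
example = json.loads("""{
  "schema_version": "v1",
  "name_for_human": "Todo Plugin",
  "name_for_model": "todo",
  "description_for_human": "Manage a todo list.",
  "description_for_model": "Plugin for managing a todo list.",
  "auth": {"type": "none"},
  "api": {"type": "openapi", "url": "http://localhost:5003/openapi.yaml"},
  "logo_url": "http://localhost:5003/logo.png",
  "contact_email": "dev@example.com",
  "legal_info_url": "http://example.com/legal"
}""")
print(sorted(missing_manifest_fields(example)))  # → []
```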
Do you use the local plugin for dev or testing? When you are developing a new version, how do you deploy and test without touching the production version that users are using?
Yes. The plugin is started as an HTTP server from an SWI-Prolog command prompt and runs on the same machine during development.
Welcome to SWI-Prolog (threaded, 64 bits, version 9.1.8)
...
?- working_directory(_,'C:/Users/Groot/SWI-Prolog - ChatGPT plugin (localhost)').
true.
?- ['ChatGPT plugin Todo'].
% Started server at http://localhost:5003/
true.
You have the manifest and spec files hosted remotely
No, they are on the same machine for development. Development is done using VSCode on Windows and the SWI-Prolog toplevel (think REPL).
You use curl to test the endpoints and to make sure that the manifest and spec files are returning
Yes
You then open the dev tools to look at what is going on as you test out through the ChatGPT UI
Yes. This only becomes useful once the plugin server and ChatGPT start communicating. (Just before writing this I was able to get ChatGPT to load the .well-known/ai-plugin.json and openapi.yaml without errors.) The dev tools validated the files and did not like the use of - in certain fields. The - was part of the SWI-Prolog name, so I changed it to _ for now.
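That hyphen complaint can be caught before submission. Here is a quick local check (the exact pattern ChatGPT's validator uses is an assumption on my part; what I observed is that hyphens are rejected and underscores pass):

```python
import re

# Assumption: the validator accepts only letters, digits, and underscores
# in name_for_model (a leading letter is also assumed). Hyphens were
# rejected in practice, which is what prompted the - to _ change.
NAME_RE = re.compile(r"^[a-zA-Z][a-zA-Z0-9_]*$")

def valid_name_for_model(name: str) -> bool:
    """True if the name should pass the (assumed) validator pattern."""
    return NAME_RE.fullmatch(name) is not None

print(valid_name_for_model("swi-prolog"))   # → False (contains '-')
print(valid_name_for_model("swi_prolog"))   # → True
```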
Chrome's developer tools (F12) are also open the entire time, and caching is disabled.
Both.
Good question.
I don’t have any production versions at the moment. That may happen down the road, but I will need to shop the idea around and see if it floats. For a hint of an SWI-Prolog server running full time online, see SWISH.
I have both a dev and prod version deployed. So I can update the dev version and install it as an unverified plugin. If that works then I can deploy to production.
I also have a set of simple integration tests that run through a set of calls against the API to check it functions.
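A minimal version of such an integration test can be self-contained by standing the API up in-process. The todo endpoint, payload, and port handling below are all hypothetical stand-ins, not the poster's actual API:

```python
# Sketch of a simple integration test "running a set of calls against the
# API to check it functions". The /todos endpoint here is a hypothetical
# stand-in started in-process so the test is self-contained.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

TODOS = ["write spec"]

class TodoHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/todos":
            body = json.dumps(TODOS).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # keep test output quiet
        pass

# Port 0 asks the OS for a free port; the daemon thread dies with the process.
server = HTTPServer(("127.0.0.1", 0), TodoHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_address[1]}"

# The "integration test": hit the endpoint and check the response.
with urllib.request.urlopen(f"{base}/todos") as resp:
    assert resp.status == 200
    assert json.loads(resp.read()) == ["write spec"]
print("integration test passed")
```

Pointing `base` at the deployed dev URL instead of the in-process server turns the same assertions into a pre-deploy smoke test.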
Hi
I am a Kurd and it is difficult for me to understand your text. Please help us (Kurdish people) with our language. We are 50-70 million people, and the world is very hard for us. Also, I have difficulty signing in to ChatGPT.
Lack of documented good practices on how to design services and APIs for plugins. I would like to read about how to deal with problems that affect every plugin developer: overly large returned responses, correct API documentation.
No detailed error information - you have to guess where the error is
I don’t know how to purge the openapi.yaml that OpenAI has already fetched.
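On the "too large returned responses" pain point above, one common workaround is to cap what the API sends back to the model. The budget and keep-leading-items strategy here are my assumptions, not an OpenAI recommendation:

```python
# One way to handle overly large responses: trim the payload to a budget
# before returning it to ChatGPT. Budget and strategy are assumptions.
import json

def cap_items(items: list, max_chars: int) -> list:
    """Keep leading items while the JSON-encoded result fits the budget."""
    kept = []
    for item in items:
        if len(json.dumps(kept + [item])) > max_chars:
            break
        kept.append(item)
    return kept

# Hypothetical oversized API result: 100 rows of filler text.
rows = [{"id": i, "note": "x" * 50} for i in range(100)]
small = cap_items(rows, max_chars=500)
print(len(small), len(json.dumps(small)))
```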
Since Chrome seems to be the only browser that always works correctly, are you using Chrome’s developer tools (F12)?
Yes, I investigated the network traffic and read the OpenAI API responses directly. It didn’t give me any more detail on my issues.
Are you also using "Open plugin devtools"?
No, I hadn’t noticed it! It looks great: it provides additional information and the ability to refresh the plugin. That’s what I wanted, thank you.
Side note: I feel it should be enabled by default every time someone creates their own plugin. It’s too easy to miss the feature right now.
Plugin Developer here
It’s really easy to create and develop these plugins, once you realize that you just have to read the documentation. They are just APIs, that’s it.
Most of my web services were already APIs, so you just have to provide the ai-plugin.json (i.e. the “reason” for the plugin) and an openapi.yaml (the nice thing is, you decide what capabilities the plugin has by including and excluding endpoints).
Setting CORS up properly is standard with RESTful APIs.
You can even set up authentication granularly, which is pretty nice.
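On the CORS point: the headers a plugin API typically needs boil down to allowing the ChatGPT origin. The helper below reflects the usual CORS preflight contract, not an official OpenAI list, and the allowed origin is the one commonly cited for plugins:

```python
# Sketch of the CORS response headers a plugin API typically sends so the
# ChatGPT origin can call it cross-origin. Header set and origin are the
# conventional ones, not an official OpenAI specification.
ALLOWED_ORIGIN = "https://chat.openai.com"

def cors_headers(request_origin: str) -> dict:
    """Headers to attach to responses (including OPTIONS preflights)."""
    if request_origin != ALLOWED_ORIGIN:
        return {}  # unknown origin: send no CORS headers at all
    return {
        "Access-Control-Allow-Origin": request_origin,
        "Access-Control-Allow-Methods": "GET, POST, OPTIONS",
        "Access-Control-Allow-Headers": "Content-Type, Authorization",
    }

print(cors_headers("https://chat.openai.com")["Access-Control-Allow-Origin"])
print(cors_headers("https://evil.example"))  # → {}
```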
The Retrieval Plugin is a godsend!
Thank you, OpenAI, for contributing to the open-source community in such a big way.
As you can see I don’t do much with GitHub repos so hopefully everything you need is there. Let me know if you have problems getting it to run and I should be able to help.
The current design of the plugin is solely for compiling Prolog code using SWI-Prolog, and it does not include a feature to run the code. However, there are plans to potentially add this functionality in the future. It’s important to note that running Prolog code outside of a sandbox environment on an HTTP server can pose significant risks to the system running the HTTP server.
On the other hand, Prolog code that is restricted to run only within a sandbox can often be so limited that it hampers the development of production-grade code. This is because the plugin would frequently fail to compile such code, as it would not pass the sandbox checks during the compilation process.
Nevertheless, if the plugin is installed and operated by someone using localhost, running all Prolog programs should be permissible. This is because the risks associated with running Prolog code outside of a sandbox are significantly limited to just the localhost environment.
Thank you – interesting. But I meant independently of your plugin, since you work in Prolog (I envy you that), do you talk to GPT from swipl? Or do you just use GPT to help write Prolog code?