Hey folks, I’d like to introduce what I’ve been working on for the last two years. As you might have guessed, it’s an AI assistance plugin for Sublime Text, which, to be fair, isn’t that hard to build. But I claim it’s not just the best-made and most feature-rich AI assistant in the ST ecosystem; it’s almost on par with Zed’s or even Cursor’s capabilities, all with zero funding so far (I’m aware that’s a disadvantage for now).
So, thanks to the o1 release, we now know that at least some people at OpenAI have good taste, because they use Sublime Text in their work. That was the final push that made me decide to promote my work. It’s not just me: 4.6k installs to date suggest others find it an indispensable feature too, and I think it’d be a real shame if the folks at OpenAI who actually work in Sublime Text weren’t aware of the OpenAI Completion plugin for Sublime Text.
So let me give a brief overview of the plugin’s most useful features (you can read more details in the readme, which I can’t link here yet):
- The first goal I’ve tried to achieve with this plugin is to make the AI assistant as deeply and seamlessly integrated into ST as possible. The philosophy behind this is: “If two great tools already exist, just integrate them well rather than reinventing the wheel.” Here’s what that means:
- The chat with the assistant benefits from everything ST provides:
- Full-text search, symbol-list navigation, first-class Markdown with support for injected code snippets, and much more.
- Users can select any text or tab (referred to as “Sheet” in ST) and pass its content to the assistant as additional context.
- Users can pass images to the assistant.
- With any request, users can include an additional instruction, or skip it if the provided content is self-explanatory.
- Users can use in-buffer overlays as a response UI (referred to as phantom mode) to manipulate code safely within a kind of sandbox.
- The second goal was to provide as much flexibility as possible. Again, the philosophy behind it was: “a professional tool should not limit professionals in how they do their work; it should support them wherever they go.” So here we have:
- a completely modifiable AI assistant configuration, with every useful setting you could imagine (response streaming toggle, different output modes, custom server URL support, customizable UI for assistant details in the status bar, chat layout customization, and much more)
- connection proxying (I believe I’ve been honored with a medal in a few authoritarian countries for this particular feature)
- a UX that doesn’t force the user to stick with a single assistant throughout their session.
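To give a feel for what that configurability looks like in practice, here is a minimal sketch of a settings file in the usual Sublime JSON style. The key names below are illustrative assumptions, not the plugin’s actual schema; check the readme for the real settings:

```json
{
    // Illustrative sketch only — key names are hypothetical,
    // not the plugin's actual settings schema.
    "url": "https://my-self-hosted-llm.example.com/v1", // custom server URL
    "stream": true,                       // toggle response streaming
    "proxy": "socks5://127.0.0.1:9050",   // route requests through a proxy
    "assistants": [
        {
            "name": "Code reviewer",
            "chat_model": "gpt-4o",
            "output_mode": "phantom"      // in-buffer overlay instead of chat view
        }
    ]
}
```

Like other Sublime settings, a file in this style would support comments and could be overridden per-project.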
I think I should stop here, because even now it can barely be considered a brief overview, and I have so much more to say. I hope I’ve convinced you to give this plugin a try if you’re an ST user. I truly believe that once you try it, you’ll never want to go back to the plain old ST experience.
PS: This plugin only came to life because of you folks, who released GPT-3.5 almost two years ago. Despite never having written an ST plugin before, I managed to implement the first MVP in just two days.