Vectra: a fast and free local vector database for JavaScript/TypeScript

Vectra is a vector database, similar to Pinecone, that uses local files to store the index and items. It supports most of Pinecone's features, including metadata filtering.


The repo is here:


Cool, I'm working on something similar as we speak. How are you creating your vectors? I'm trying to use USE-lite, though with only 8000 tokens I'm very curious how well it will work.

In my sample I’m using OpenAI’s embeddings API:

async function getVector(text: string) {
    const response = await api.createEmbedding({
        model: 'text-embedding-ada-002',
        input: text,
    });
    // The v3 Node SDK returns an axios response, so the API payload
    // is under response.data.
    return response.data.data[0].embedding;
}

Hmm, that’s a cool idea. Ada is fairly cheap. For some reason, getting USE-lite to work inside a Chrome extension is near impossible.

You could probably easily create a version of Vectra that runs in a browser and uses the browser's local storage. I didn't have a need for that and didn't want to have to polyfill fs when running locally. You'd need to call a service for your embeddings, but otherwise you could do everything else browser-side.
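A minimal sketch of that browser-side idea might look like the following. Note this is illustrative, not part of Vectra: the `BrowserIndex` class and its method names are hypothetical, and the snippet falls back to an in-memory map when `localStorage` isn't available so it also runs outside a browser.

```typescript
// Hypothetical browser-side index. BrowserIndex is an illustrative name,
// not a Vectra API. Uses localStorage in a browser, a Map elsewhere.
type Item = { id: string; vector: number[]; metadata?: Record<string, unknown> };

const memoryStore = new Map<string, string>();
const store = {
  get(key: string): string | null {
    return typeof localStorage !== 'undefined'
      ? localStorage.getItem(key)
      : memoryStore.get(key) ?? null;
  },
  set(key: string, value: string): void {
    if (typeof localStorage !== 'undefined') localStorage.setItem(key, value);
    else memoryStore.set(key, value);
  },
};

function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

class BrowserIndex {
  constructor(private key = 'vector-index') {}

  insertItem(item: Item): void {
    const items = this.load();
    items.push(item);
    store.set(this.key, JSON.stringify(items));
  }

  // Brute-force nearest neighbors: score every stored item, keep the top-k.
  queryItems(vector: number[], topK: number): { item: Item; score: number }[] {
    return this.load()
      .map((item) => ({ item, score: cosineSimilarity(vector, item.vector) }))
      .sort((a, b) => b.score - a.score)
      .slice(0, topK);
  }

  private load(): Item[] {
    const raw = store.get(this.key);
    return raw ? (JSON.parse(raw) as Item[]) : [];
  }
}
```

The embeddings would still come from a remote service (e.g. the OpenAI API call shown earlier in the thread); only storage and querying happen browser-side.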


Is this meant to work with Dissecting Auto-GPT's prompt?

Yeah, I built it so I could add support for infinite commands… I’m not 100% sure how well it’s going to work, though, to be honest. There’s a bit of a chicken-and-egg problem: you ideally want to search for commands based on AutoGPT’s current thought, but you need to feed it the right commands before it thinks. I’m going to start with selecting commands based on the initial goal and see how it goes from there.
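The "select commands from the initial goal" step described above could be sketched like this. Everything here is hypothetical scaffolding, not code from the project: the `Command` shape, `selectCommands`, and the hard-coded embeddings stand in for vectors an embeddings API would return.

```typescript
// Hypothetical command type; the embedding would come from an embeddings API.
type Command = { name: string; description: string; embedding: number[] };

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Rank the available commands by similarity to the goal's embedding and
// keep only the top-k to include in the prompt.
function selectCommands(goalEmbedding: number[], commands: Command[], topK: number): Command[] {
  return [...commands]
    .sort((a, b) => cosine(goalEmbedding, b.embedding) - cosine(goalEmbedding, a.embedding))
    .slice(0, topK);
}

// Toy 2-d embeddings purely for illustration.
const commands: Command[] = [
  { name: 'search_web', description: 'Search the web for information', embedding: [0.9, 0.1] },
  { name: 'write_file', description: 'Write text to a local file', embedding: [0.1, 0.9] },
];

const picked = selectCommands([0.8, 0.2], commands, 1);
```

Later iterations could re-run the same ranking against the agent's most recent thought instead of the initial goal, which is one way around the chicken-and-egg problem.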

My general suggestion to folks going down this path is to take a very narrow but compelling problem you’re interested in and solve that first, then slowly broaden out / generalize to other problems, learning as you go.

I think the approach of trying to do too much first is causing some misadventures.


There’s now a Python port of Vectra 🙂 More language bindings welcome 🙂