On the verge of a breakthrough

I’ve almost got this expert chatbot thing cracked… I’m really on the verge of a major breakthrough. I tested my idea in the Playground and it looks like it’s going to work! I just have one little problem: I can’t seem to get the dot product of the embeddings. Someone please help me out so I can tell my boss I’m a fraggin’ genius (with all due credit given to whoever answers this question). So here’s my code:

function begin(){

    
    for(let q = 0; q < questions.length; q++) {
        let question = questions[q];
        let answers = [];
        for(let s = 0; s < sections.length; s++) {
            let section = sections[s];

            let qtensor = Array(question.embedding);
            //qtensor = tf.tensor(qtensor);
            let stensor = Array(section.embedding);
            //stensor = tf.tensor(Array(stensor));

            let answer = new Answer();
            answer.score = tf.dot(qtensor,stensor);
            answer.title = section.title;
            answers.push(answer);
        }
        answers.sort((a,b) => a.score - b.score);
        console.log("Q: " + q.text);
        console.log("A: " + answers[0].title);
    }
    console.log("Done!");
}

I know I’m close here… what am I doing wrong?
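
A minimal sketch of the dot-product score that loop seems to be after, assuming tf is TensorFlow.js and that question.embedding and section.embedding are flat, equal-length arrays of numbers. Note that tf.dot returns a tensor rather than a plain number, which is also why the a.score - b.score sort won't behave as expected:

// const tf = require('@tensorflow/tfjs'); // or loaded via a script tag

// Sketch only: inner product of two flat (1-D) embedding arrays.
// tf.dot on 1-D inputs returns a scalar tensor, so dataSync() is used
// to pull the value out as a plain JavaScript number.
function dotScore(questionEmbedding, sectionEmbedding) {
    const score = tf.dot(questionEmbedding, sectionEmbedding); // scalar tensor
    return score.dataSync()[0];                                // plain number
}

// e.g. answer.score = dotScore(question.embedding, section.embedding);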


Try putting your code into the Codex edit feature and instructing it to do what you want it to do.

Have you looked at the cosine_similarity API?
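
If the embeddings are just arrays of numbers, cosine similarity doesn't need TensorFlow.js at all; it's the dot product divided by the product of the two vector lengths. A rough sketch in plain JavaScript, assuming equal-length inputs:

// Rough sketch: cosine similarity of two equal-length number arrays.
// OpenAI embeddings are normalized to unit length, so in practice this
// comes out effectively the same as the raw dot product.
function cosineSimilarity(a, b) {
    let dot = 0;
    let normA = 0;
    let normB = 0;
    for (let i = 0; i < a.length; i++) {
        dot += a[i] * b[i];
        normA += a[i] * a[i];
        normB += b[i] * b[i];
    }
    return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}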

I had to transpose the section embedding. Then, after fixing another bug here and there, I got what I was hoping for. Well, not quite, but hey, on to the next version number.
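
For anyone following along, roughly what the transposed version might look like, assuming tf is TensorFlow.js and the embeddings are equal-length arrays of numbers:

// Sketch only: wrap each embedding as a 1 x N matrix, transpose the
// section one to N x 1, and multiply; the result is a [1, 1] tensor,
// so dataSync()[0] pulls the score out as a plain number.
function scoreSection(questionEmbedding, sectionEmbedding) {
    const q = tf.tensor2d([questionEmbedding]);            // shape [1, N]
    const s = tf.tensor2d([sectionEmbedding]).transpose(); // shape [N, 1]
    return tf.dot(q, s).dataSync()[0];
}

With a plain number in answer.score, the sort also needs to be descending (b.score - a.score) so that answers[0] is the best match rather than the worst.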