If we re-pull embeddings at a smaller vector length for ada-002 (256), can we also use embeddings pulled from the larger, recently released text-embedding-3 models (small/large)? Is there a mapping between the small and large models?
The new text-embedding-3 models work completely differently from ada-002 and the two are not compatible: vectors from one model cannot be compared against vectors from the other. You’ll need to re-embed.
I could imagine that if you had an ada DB and a new text-embedding-3 DB, you could potentially fuse the results from both (although you’d still need to remap the scores, since the two models’ similarity scales differ).
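One way to sidestep the score-remapping problem entirely is rank-based fusion. A minimal sketch using reciprocal rank fusion (RRF) is below; this is a standard technique I’m suggesting, not something from this thread. It only looks at each document’s rank in each index’s result list, so the incompatible raw scores never have to be compared. The constant `k=60` is a commonly used default.

```python
# Hedged sketch: fusing ranked results from an ada-002 index and a
# gen-3 index via reciprocal rank fusion (RRF). Rank-only, so the two
# models' incompatible score scales never need remapping.
def rrf_fuse(ranked_lists, k=60):
    """Fuse several ranked lists of doc IDs into one list.

    Each list is ordered best-first; a doc at rank r contributes
    1 / (k + r) to its fused score. Higher fused score wins.
    """
    scores = {}
    for ranking in ranked_lists:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    # Sort doc IDs by fused score, best first.
    return sorted(scores, key=scores.get, reverse=True)
```

Docs that both indexes rank highly float to the top, which is exactly the behaviour you’d want while migrating between embedding generations.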
I’ve just done a migration from gen 2 to gen 3 for two of my Discourse forums … some 150k embeddings … costing me well under a dollar … not so bad! It just took a while!
If only GPT 4 was that cheap!