GPT-3 vs SBERT for semantic search/similarity?

Embeddings have been out for a while now and I wanted to get some external opinions on semantic search.

There have been a lot of discussions on Twitter and on a few ML/AI-related YouTube podcasts about the performance of GPT-3 versus more standard, semantic-search-specific models such as SBERT and SPLADE.

For query-to-document search (let’s say an input text query that needs to return something from a Google Doc or Notion page), what would you guys go with?
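
To be concrete, the setup I have in mind is the usual bi-encoder pattern: embed every document once, embed the query at search time, and rank by cosine similarity. A rough sketch with sentence-transformers (the model name and the toy documents are just placeholders, not a recommendation):

```python
from sentence_transformers import SentenceTransformer, util

# Placeholder model and corpus -- swap in whatever you're actually searching over.
model = SentenceTransformer("all-MiniLM-L6-v2")

docs = [
    "Quarterly planning notes from the product team.",
    "How to set up the staging environment.",
    "Meeting minutes: hiring pipeline review.",
]

# Embed the documents once (in practice you'd cache/store these).
doc_embeddings = model.encode(docs, convert_to_tensor=True)

# Embed the query at search time and rank documents by cosine similarity.
query = "how do I configure staging?"
query_embedding = model.encode(query, convert_to_tensor=True)
scores = util.cos_sim(query_embedding, doc_embeddings)[0]

best = int(scores.argmax())
print(docs[best], float(scores[best]))
```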

3 Likes

text-embedding-ada-002 works quite well for the projects I’m working on. My only complaint is that it took a long time to process the data. I don’t have any experience yet with SBERT or SPLADE. Other people have recommended Cohere to me.

I’ll try some of those later, but for now ada-002 works for me. If you find a better way, please let me know.
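
For anyone trying ada-002, the call itself is straightforward; something along these lines with the pre-1.0 openai Python client (the batch size is arbitrary, and sending inputs in batches is one way to cut down on per-request overhead when processing a lot of documents):

```python
import openai

openai.api_key = "sk-..."  # your API key

def embed(texts, model="text-embedding-ada-002", batch_size=100):
    """Embed a list of strings, sending them to the API in batches."""
    embeddings = []
    for i in range(0, len(texts), batch_size):
        batch = texts[i:i + batch_size]
        response = openai.Embedding.create(model=model, input=batch)
        # Sort by index to keep the embeddings aligned with the input order.
        for item in sorted(response["data"], key=lambda d: d["index"]):
            embeddings.append(item["embedding"])
    return embeddings
```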

1 Like