Couple the GPT-3.5 Turbo model with an external data source

We are currently developing an online comparison portal that uses GPT-3.5 Turbo to interact with users and suggest the best products.

Unfortunately, our product catalog and related information are far too long to fit in the system prompt.

Is there a way to couple the GPT-3.5 Turbo model with an external data source so that it is not necessary to have all the information in the system prompt?

Kind regards

Welcome to the forum.

Do a quick search on RAG (Retrieval Augmented Generation) and embeddings… lots of good stuff.
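To give you a feel for it, here's a minimal sketch of the embeddings-based retrieval flow, assuming the official `openai` Python package (v1.x) and a small in-memory product list; the catalog entries and the question are just placeholders:

```python
# Minimal RAG sketch: embed the catalog once, embed each user query,
# retrieve the closest products, and pass only those to the chat model.
import numpy as np
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Placeholder catalog entries; in practice these come from your database.
catalog = [
    "Acme X100 vacuum cleaner, 600 W, bagless, 59 dB, 199 EUR",
    "Acme X200 vacuum cleaner, 750 W, HEPA filter, 249 EUR",
    "BreezePro fan, 3 speeds, remote control, 49 EUR",
]

def embed(texts):
    resp = client.embeddings.create(model="text-embedding-ada-002", input=texts)
    return np.array([d.embedding for d in resp.data])

catalog_vectors = embed(catalog)  # compute once and cache/store the vectors

def top_products(query, k=2):
    q = embed([query])[0]
    # Cosine similarity between the query and every catalog entry
    sims = catalog_vectors @ q / (
        np.linalg.norm(catalog_vectors, axis=1) * np.linalg.norm(q)
    )
    return [catalog[i] for i in np.argsort(sims)[::-1][:k]]

def answer(question):
    context = "\n".join(top_products(question))
    chat = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system",
             "content": f"Recommend products using only this data:\n{context}"},
            {"role": "user", "content": question},
        ],
    )
    return chat.choices[0].message.content

print(answer("Which quiet vacuum cleaner under 220 EUR would you recommend?"))
```

For a real catalog you'd store the vectors in a vector database instead of a NumPy array, but the flow is the same.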

Hope you stick around. We’ve got a great community growing.

Hello

Yes, instead of embedding the entire product catalog in the system prompt, you can build an interface that links GPT-3.5 Turbo to an external data source. The interface queries that source for the relevant information and injects it into the conversation as needed, so you don't have to load all the data into the model's prompt.
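As a rough illustration of that "query first, then prompt" pattern (the `search_products` helper and the SQLite `products` table are just stand-ins for whatever data source you actually use):

```python
# Sketch: look up only the relevant catalog rows in an external store,
# then hand just those rows to the model.
import sqlite3
from openai import OpenAI

client = OpenAI()

def search_products(conn, keyword, limit=5):
    # Hypothetical products table; replace with your real schema and query.
    rows = conn.execute(
        "SELECT name, price, description FROM products "
        "WHERE description LIKE ? LIMIT ?",
        (f"%{keyword}%", limit),
    ).fetchall()
    return "\n".join(f"{name} ({price} EUR): {desc}" for name, price, desc in rows)

def recommend(conn, user_question, keyword):
    context = search_products(conn, keyword)
    chat = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system",
             "content": "You are a comparison-portal assistant. "
                        f"Base your suggestions only on these products:\n{context}"},
            {"role": "user", "content": user_question},
        ],
    )
    return chat.choices[0].message.content
```

Deciding *what* to search for is the part you'd normally hand to the model itself, which is where function calling comes in below.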

You can use function calling for this.
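Roughly, that could look like the sketch below; `search_product_catalog` and its parameters are hypothetical and would wrap your own data source:

```python
# Function-calling sketch: the model asks for a catalog search, your code
# runs it, and the result is fed back for the final answer.
import json
from openai import OpenAI

client = OpenAI()

tools = [{
    "type": "function",
    "function": {
        "name": "search_product_catalog",  # hypothetical wrapper around your data source
        "description": "Search the product catalog for matching products.",
        "parameters": {
            "type": "object",
            "properties": {
                "query": {"type": "string", "description": "Search terms"},
                "max_price": {"type": "number", "description": "Optional price cap in EUR"},
            },
            "required": ["query"],
        },
    },
}]

def search_product_catalog(query, max_price=None):
    # Replace with a real lookup against your database or search index.
    return json.dumps([{"name": "Acme X100", "price": 199, "noise_db": 59}])

messages = [{"role": "user", "content": "I need a quiet vacuum cleaner under 220 EUR."}]
first = client.chat.completions.create(model="gpt-3.5-turbo", messages=messages, tools=tools)

# Assumes the model chose to call the tool; check message.tool_calls in real code.
call = first.choices[0].message.tool_calls[0]
args = json.loads(call.function.arguments)

messages.append(first.choices[0].message)  # the assistant's tool request
messages.append({                          # and your tool's result
    "role": "tool",
    "tool_call_id": call.id,
    "content": search_product_catalog(**args),
})

final = client.chat.completions.create(model="gpt-3.5-turbo", messages=messages, tools=tools)
print(final.choices[0].message.content)
```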

Alternatively, you could fine-tune the model on your own data, which, in my opinion, might be a more cost-effective solution.
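If you go that route, here's a hedged sketch of preparing and starting a fine-tune with the `openai` Python package; the file name and example rows are placeholders:

```python
# Fine-tuning sketch: upload chat-formatted JSONL training data and start a job.
# Each line of products.jsonl holds one example conversation, e.g.:
# {"messages": [{"role": "system", "content": "You are a comparison-portal assistant."},
#               {"role": "user", "content": "Which quiet vacuum under 220 EUR?"},
#               {"role": "assistant", "content": "The Acme X100 (199 EUR, 59 dB) fits best."}]}
from openai import OpenAI

client = OpenAI()

training_file = client.files.create(
    file=open("products.jsonl", "rb"),  # placeholder path to your prepared examples
    purpose="fine-tune",
)

job = client.fine_tuning.jobs.create(
    training_file=training_file.id,
    model="gpt-3.5-turbo",
)
print(job.id, job.status)
```

Note that the training data is example conversations rather than the raw catalog, so many people pair a fine-tune with retrieval to keep product facts up to date.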
