Hi, I have an online shop that sells drinks. How can I train a custom model so it knows what we sell? For example, when a customer asks "What coffee do we have?", it should respond "Espresso, Americano, Latte" and not "Vietnamese coffee", because we don't sell that.
I have more than 100 drinks, so listing them all in a system message like this isn't an option:
openai.ChatCompletionRequest{
	Model: openai.GPT3Dot5Turbo,
	Messages: []openai.ChatCompletionMessage{
		{
			Role: openai.ChatMessageRoleSystem,
			Content: "You are a drink shop chatbot that serves these products: Hot Espresso, Cold Espresso, " +
				"Hot Latte, Cold Latte, Cold Americano, Hot Americano, Strawberry Juice, Watermelon Juice, " +
				"Lemonade Juice. Keep talking to the user until you know what product they want. " +
				"Return ONLY the product name, without any other text, when you find out.",
		},
		{
			Role:    openai.ChatMessageRoleUser,
			Content: userMessage,
		},
	},
	User:      user,
	MaxTokens: 60,
}
Usually the answer is to use embeddings instead of fine-tuning.
This can be done in many ways, and for your situation it makes more sense than fine-tuning because you cannot un-fine-tune a model. You can, however, remove products from your database at any time.
If you were set on fine-tuning, you might train a model to generate SQL queries against your database, but I wouldn't recommend this.
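To make the embeddings suggestion concrete, here is a minimal sketch of the retrieval side in Go. The idea: embed each product description once (e.g. with text-embedding-ada-002), embed the user's question at query time, rank products by cosine similarity, and inject only the top matches into the system prompt. The `product`, `cosine`, and `topK` names are illustrative, not from any library, and the 3-dimensional vectors are toy values just to show the mechanics (real ada-002 vectors have 1536 dimensions).

```go
package main

import (
	"fmt"
	"math"
	"sort"
)

// product pairs a catalog item with its precomputed embedding.
type product struct {
	Name      string
	Embedding []float32 // fetched once from the Embeddings endpoint and stored
}

// cosine returns the cosine similarity of two equal-length vectors.
func cosine(a, b []float32) float64 {
	var dot, na, nb float64
	for i := range a {
		dot += float64(a[i]) * float64(b[i])
		na += float64(a[i]) * float64(a[i])
		nb += float64(b[i]) * float64(b[i])
	}
	return dot / (math.Sqrt(na) * math.Sqrt(nb))
}

// topK returns the names of the k products most similar to the query embedding.
func topK(query []float32, products []product, k int) []string {
	sort.Slice(products, func(i, j int) bool {
		return cosine(query, products[i].Embedding) > cosine(query, products[j].Embedding)
	})
	names := make([]string, 0, k)
	for i := 0; i < k && i < len(products); i++ {
		names = append(names, products[i].Name)
	}
	return names
}

func main() {
	products := []product{
		{"Hot Latte", []float32{0.9, 0.1, 0.0}},
		{"Lemonade Juice", []float32{0.0, 0.9, 0.4}},
		{"Cold Americano", []float32{0.5, 0.5, 0.1}},
	}
	query := []float32{1.0, 0.0, 0.0} // pretend embedding of "hot coffee"
	fmt.Println(topK(query, products, 2))
}
```

The shortlist returned by `topK` can then go into the system message of the chat request shown above, so the prompt only ever carries a handful of relevant products instead of all 100+.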
Here are the results you mentioned above, @thangld. Just for fun, using a text-davinci-003 chatbot I coded today:
Sorry, no cold beer
HTH
This is great!
I was following this Embeddings example, and when I finally send this prompt:
Q: Do you have any drinks for a cold day?
A: Yes, we offer hot or iced versions of our espresso, latte, and Americano coffees, as well as our strawberry, watermelon, and lemonade juices. You can also adjust the sugar and ice levels for the juices.
Q: I'll have a cup of hot latte please.
A:
I get an answer like this:
Great! We have hot Latte available. Do you want anything else?
OK. Now I need to know exactly what product and ice level the customer wants so I can add it to the cart programmatically. I tried adding the following lines to the prompt; it works with ChatGPT-3.5, but it does not seem to work with text-embedding-ada-002 (the newest embeddings model I can find):
Answer the question as following format:
* User intent (required):
* Product type (coffee or juice, required):
* Product name (optional):
* Ice level (optional):
The former outputs text, while the latter outputs vectors, so I'm not sure what you mean by it working on one but not the other.
I mean when I add this to the prompt:
Answer the question as following format:
* User intent (required):
* Product type (coffee or juice, required):
* Product name (optional):
* Ice level (optional):
ChatGPT-3.5 will respond in this format, but my embeddings model won't. I need the response in a fixed format so I can extract the information I want from it.
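As the reply above notes, an embeddings model only returns vectors, so the formatted answer has to come from a chat/completion model. Once it replies in the requested bullet format, the fields can be extracted with plain string handling. A sketch, assuming the model echoes the `* Field (annotation): value` lines from the prompt (`parseOrder` is a hypothetical helper, not a library function):

```go
package main

import (
	"fmt"
	"strings"
)

// parseOrder extracts "* Field: value" pairs from a model reply that
// follows the format requested in the prompt. The "(required)"/"(optional)"
// annotations are stripped from the field names if the model echoes them.
func parseOrder(reply string) map[string]string {
	fields := map[string]string{}
	for _, line := range strings.Split(reply, "\n") {
		line = strings.TrimSpace(strings.TrimPrefix(strings.TrimSpace(line), "*"))
		key, value, found := strings.Cut(line, ":")
		if !found {
			continue
		}
		if i := strings.Index(key, "("); i >= 0 {
			key = key[:i] // drop the parenthesized annotation
		}
		fields[strings.TrimSpace(key)] = strings.TrimSpace(value)
	}
	return fields
}

func main() {
	reply := `* User intent (required): add to cart
* Product type (coffee or juice, required): coffee
* Product name (optional): Hot Latte
* Ice level (optional):`
	order := parseOrder(reply)
	fmt.Println(order["Product name"], "/", order["User intent"])
}
```

A more robust variant would ask the model for JSON instead of bullets and unmarshal it with `encoding/json`, but the string approach above matches the bullet format already in the prompt.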