Fine-tune a model on context without using prompt and completion pairs

I want to fine-tune a model using my knowledge base as context, so it can answer from that context without me sending it every time. However, I don’t have “prompt” and “completion” pairs to fine-tune with.

See this case study. You can just put all your context into the completions and leave the prompts empty. OpenAI API
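For anyone wondering what that training data would actually look like: here is a minimal sketch, assuming the legacy prompt/completion fine-tuning format. The passages, file name, and leading-space convention are illustrative, not an official recipe.

```python
import json

# Hypothetical knowledge-base passages; replace with your own documents.
passages = [
    "Clause 12.3: The indemnifying party shall bear all third-party costs ...",
    "Internally, 'GreenBook' refers to the company style guide ...",
]

# Legacy prompt/completion fine-tuning format: leave every prompt empty and
# put each knowledge-base passage into the completion, one JSON object per line.
with open("train.jsonl", "w", encoding="utf-8") as f:
    for text in passages:
        record = {"prompt": "", "completion": " " + text}  # leading space per legacy guidance
        f.write(json.dumps(record) + "\n")
```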


Hi Boris, the link leads nowhere. Where can I find the case study?

Check the use-case section under the heading “Creating an expert model in the legal domain which understands internal company jargon” at this link: OpenAI API


Is this still a suitable method, or should we use embeddings as described here? I ask because I can no longer find the case study mentioned.


Hi, for me personally, embeddings are the way to go.
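For readers landing here later, below is a minimal sketch of what the embeddings route can look like, assuming the current openai Python SDK (v1+) and numpy; the model name, chunks, and prompt template are all illustrative.

```python
import numpy as np
from openai import OpenAI  # assumes the openai Python SDK, v1 or later

client = OpenAI()
EMBED_MODEL = "text-embedding-3-small"  # model choice is illustrative

# Hypothetical knowledge-base chunks; in practice these come from your documents.
chunks = [
    "Our refund policy allows returns within 30 days of purchase.",
    "Support hours are 9am-5pm CET, Monday to Friday.",
]

def embed(texts):
    resp = client.embeddings.create(model=EMBED_MODEL, input=texts)
    return np.array([d.embedding for d in resp.data])

chunk_vecs = embed(chunks)

def retrieve_context(question, top_k=1):
    q_vec = embed([question])[0]
    # Cosine similarity between the question and every stored chunk.
    sims = chunk_vecs @ q_vec / (
        np.linalg.norm(chunk_vecs, axis=1) * np.linalg.norm(q_vec)
    )
    best = np.argsort(sims)[::-1][:top_k]
    return "\n".join(chunks[i] for i in best)

question = "How long do I have to return an item?"
context = retrieve_context(question)
# The retrieved context is then placed directly in the prompt, as suggested below.
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
print(prompt)
```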


Providing context in the prompt generates the best result for me.


That should work if the context is small. The problem is if you want to analyze a novel, for example; in that case, embeddings are the right way to do it (see the chunking sketch after this post).

But there are also cases where you don’t know the full context in advance, or you want to add more context later. There are many circumstances to take into consideration.
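On the novel example: a long document has to be split into chunks before it can be embedded. Here is a minimal sketch of one way to do that, with purely illustrative sizes.

```python
def chunk_text(text: str, size: int = 1000, overlap: int = 200) -> list[str]:
    """Split a long document (e.g. a novel) into overlapping character chunks.

    The sizes here are illustrative; tune them to your embedding model's limits.
    """
    chunks = []
    step = size - overlap
    for start in range(0, len(text), step):
        piece = text[start:start + size]
        if piece.strip():
            chunks.append(piece)
    return chunks

# Each chunk is embedded once and stored; at question time only the most
# similar chunks are retrieved and placed into the prompt, as sketched above.
```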
