[Newbie] Fine tuning for Q&A

Hello,
Newbie here; here is an observation regarding fine-tuning.
When the training data has both text and numeric data in the answers, the fine-tuned model tends to forget the numbers and the answer is never correct. Here is an example:
Question: Which county does blockgroup 171150029012 belong to?
Answer: Blockgroup 171150029012 belongs to Macon County.
The fine-tuned model always returns the correct answer for that question, while for the question "What is the population of blockgroup 171150029012?" it always returns an incorrect answer. This is true for any numeric data that is part of an answer.
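For reference, each training record looks roughly like this (chat-format JSONL for fine-tuning; the population figure below is just a placeholder, not my real data):

```json
{"messages": [{"role": "user", "content": "Which county does blockgroup 171150029012 belong to?"}, {"role": "assistant", "content": "Blockgroup 171150029012 belongs to Macon County."}]}
{"messages": [{"role": "user", "content": "What is the population of blockgroup 171150029012?"}, {"role": "assistant", "content": "The population of blockgroup 171150029012 is 4187."}]}
```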
Could it be that the GPT model is learning only the text and not the numbers?
Kindly guide me: is it correct to feed factual knowledge in via fine-tuning, or should I go for RAG?
Thanks.

Welcome to the dev forum @just0engg

This is a use case for RAG since you’re looking for factual responses.
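As a rough sketch of what that looks like (placeholder data and model name, not production code): keep the facts somewhere you can query exactly, retrieve them, and let the model only phrase the answer from the retrieved context.

```python
# Minimal RAG-style sketch: exact lookup first, model only for wording.
# Assumes the blockgroup facts live in a local dict; in practice this would
# be a database or a vector store. The population value is a placeholder.
from openai import OpenAI

client = OpenAI()

BLOCKGROUP_FACTS = {
    "171150029012": {"county": "Macon County", "population": 1175},  # placeholder number
}

def answer(question: str, blockgroup_id: str) -> str:
    facts = BLOCKGROUP_FACTS.get(blockgroup_id)
    if facts is None:
        return f"No data found for blockgroup {blockgroup_id}."

    # Inject the retrieved facts into the prompt so the model never has to
    # recall numbers from its weights -- it only rewords what it is given.
    context = (
        f"Blockgroup {blockgroup_id}: county={facts['county']}, "
        f"population={facts['population']}"
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # any chat model would do here
        messages=[
            {"role": "system", "content": "Answer using only the provided context."},
            {"role": "user", "content": f"Context: {context}\n\nQuestion: {question}"},
        ],
    )
    return response.choices[0].message.content

print(answer("What is the population of blockgroup 171150029012?", "171150029012"))
```

Because the number is retrieved and placed in the prompt, it comes back exactly, whereas fine-tuning asks the model to memorize it in its weights, which is unreliable for precise values.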

Here are some common use cases for fine-tuning.