Why is the GPT-3 model not able to learn context from examples the way a seq2seq model does? Can't we fine-tune it on seq2seq-style input/output pairs and then ask the questions in the prompt?
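For context, here is roughly what I mean by seq2seq-style training pairs, written out as GPT-3 fine-tuning data. This is a minimal sketch assuming the legacy prompt/completion JSONL fine-tuning format; the example data, separator choice, and filename are illustrative, not a definitive recipe:

```python
import json

# Hypothetical seq2seq-style Q&A pairs: context plus question as the input,
# answer as the target output. The contents here are made up for illustration.
examples = [
    {
        "context": "The Eiffel Tower is 330 metres tall.",
        "question": "How tall is the Eiffel Tower?",
        "answer": "330 metres",
    },
]

# Legacy GPT-3 fine-tuning expects JSONL records with "prompt"/"completion"
# keys. A fixed separator at the end of each prompt and a stop sequence at
# the end of each completion mark where the input ends and the output begins.
SEPARATOR = "\n\n###\n\n"
STOP = "\n"

with open("train.jsonl", "w") as f:
    for ex in examples:
        record = {
            "prompt": f"{ex['context']}\nQ: {ex['question']}{SEPARATOR}",
            "completion": f" {ex['answer']}{STOP}",
        }
        f.write(json.dumps(record) + "\n")
```

At inference time the idea would be to send a new context and question ending with the same separator, and have the model complete with the answer.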