Training GPT on seq2seq-style samples

Why is the GPT-3 model unable to learn context from seq2seq-style samples? Can't we train it on seq2seq input/output pairs by phrasing the inputs as questions in the prompt?
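To make the question concrete, here is a minimal sketch of what "phrasing seq2seq pairs as prompts" could look like for a decoder-only model: each (input, output) pair is flattened into a single training string. The `Question:`/`Answer:` template and the `to_prompt` helper are illustrative assumptions, not a documented GPT-3 fine-tuning format.

```python
# Sketch: casting seq2seq (source, target) pairs into single-sequence
# text for causal-LM training. The template below is an assumption.

def to_prompt(source: str, target: str) -> str:
    """Flatten one seq2seq pair into a prompt/completion string."""
    return f"Question: {source}\nAnswer: {target}"

# Hypothetical seq2seq-style training pairs.
pairs = [
    ("Translate to French: Hello", "Bonjour"),
    ("Translate to French: Thank you", "Merci"),
]

# Each flattened string would be fed to the decoder-only model as-is.
samples = [to_prompt(src, tgt) for src, tgt in pairs]
for s in samples:
    print(s)
```

The key difference from a true seq2seq model is that here the source and target share one token stream, so the model only ever sees them jointly under a causal mask rather than through a separate encoder.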