I have an application that is a bit counterintuitive. Instead of the AI playing the role of teacher, I want the AI to act like a student asking questions of the teacher. In the prompt, I specify that the AI should never act like a teacher and should not show any knowledge of the subject unless taught by the teacher. It usually works, except sometimes when there is background noise, or for some other reason Whisper fails to transcribe what the user said. In those cases, the AI starts to act like a teacher and exhibit knowledge of the subject. Any thoughts on how to prevent this?
The chat AI is prompted with “user” and “assistant” roles, and the post-training is all about answering questions, so this would be kind of hard. On the completions endpoint with gpt-3.5-turbo-instruct, you’d be able to write your own turns for the AI to complete:
Teacher: Do you have any questions about the trig homework?
Student: (AI writes here)
That works best with a convincing preamble and several multi-shot example turns beforehand.
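A minimal sketch of that completions-style setup, assuming gpt-3.5-turbo-instruct; the preamble wording and example turns here are my own invention, not a tested prompt:

```python
# Hypothetical few-shot turns establishing the Student voice (assumed wording).
few_shot = [
    ("Teacher: Today we'll start trigonometry. Any questions so far?",
     "Student: What does \"trigonometry\" actually mean? I've never heard the word."),
    ("Teacher: It's the study of triangles and the ratios of their sides.",
     "Student: Why do the ratios matter more than the side lengths themselves?"),
]

def build_prompt(teacher_line: str) -> str:
    """Assemble preamble + multi-shot turns, ending mid-turn at 'Student:'."""
    preamble = (
        "The following is a tutoring session. The Student knows nothing about "
        "the subject except what the Teacher has already said, and only ever "
        "asks questions.\n\n"
    )
    shots = "\n".join(f"{t}\n{s}" for t, s in few_shot)
    return f"{preamble}{shots}\n{teacher_line}\nStudent:"

prompt = build_prompt("Teacher: Do you have any questions about the trig homework?")

# Then send `prompt` to the completions endpoint with stop=["Teacher:"] so
# the model completes only the Student turn and never invents the next
# Teacher line, e.g.:
# completion = client.completions.create(
#     model="gpt-3.5-turbo-instruct", prompt=prompt,
#     stop=["Teacher:"], max_tokens=150)
```

The key trick is that the prompt ends with `Student:`, so the most natural continuation is a student question rather than an answer.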
Otherwise, it is all about instructing in a system message, giving examples in that system message, and not letting the chat go on too long.
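For the chat format, that suggestion might look like the sketch below: persona instructions in the system message plus a few-shot exchange demonstrating the role. The wording is an assumption, not a tested prompt:

```python
# Hypothetical chat-format setup: system persona + one few-shot example pair.
messages = [
    {"role": "system", "content": (
        "You are a student, not a teacher. You know nothing about the subject "
        "except what the user (the teacher) has told you in this conversation. "
        "Never explain or answer; respond only with questions.")},
    # Few-shot pair demonstrating the desired student behavior:
    {"role": "user", "content": "Do you have any questions about the trig homework?"},
    {"role": "assistant", "content": (
        "Yes - in problem 3, why do we use sine instead of cosine?")},
]
# Append the real teacher turns after these, and trim old turns so the
# chat never grows long enough to dilute the persona.
```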
You can have an “inspector” AI examine the generation, and if the roles are reversed, automatically add more prompting to the user input and retry, to ensure the AI adheres to the student role.
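One way to sketch that inspector-and-retry loop. Here `looks_like_teacher` is a crude keyword heuristic standing in for a second model call, and the reminder text appended on retry is an assumption; `call_model` is a hypothetical callable wrapping your actual chat API request:

```python
# Hypothetical inspector loop: detect role reversal, reinforce, retry.
def looks_like_teacher(reply: str) -> bool:
    """Crude heuristic: flag replies that explain rather than ask."""
    explaining = any(p in reply.lower() for p in
                     ("let me explain", "the answer is", "here's how", "to solve this"))
    return explaining or "?" not in reply

def get_student_reply(call_model, user_text: str, max_retries: int = 2) -> str:
    """Call the model; if the reply reads like a teacher, re-prompt and retry."""
    prompt = user_text
    for _ in range(max_retries + 1):
        reply = call_model(prompt)
        if not looks_like_teacher(reply):
            return reply
        # Role reversal detected: reinforce the student persona and retry.
        prompt = (user_text +
                  "\n\n(Reminder: you are the STUDENT. Do not explain anything; "
                  "respond only with a question about what the teacher just said.)")
    return reply  # give up after max_retries; return the last attempt
```

A second model call asking “is this reply acting as a teacher?” would be more robust than the keyword check, at the cost of extra latency per turn.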
Frankly, I don’t think the AI has much in the way of questions to ask on its own. You might want to give it a complete persona description of who it is depicting.
We give the AI user-role instructions to act like a student, and it does.