Sorry about the response above being unrelated to what you asked. It is correct, though, that for a basic chatbot you should have: a system message with the AI's programming, the user/assistant chat history, and the most recent user input.
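For illustration, a minimal sketch of that basic message structure; the wording of the system message and the example turns are placeholders, not anything from your application:

messages = [
    # system: the chatbot's programming
    {"role": "system", "content": "You are a helpful product assistant."},
    # prior user/assistant chat history, oldest first
    {"role": "user", "content": "What services do you offer?"},
    {"role": "assistant", "content": "We offer AI programming and prompting services."},
    # the most recent user input always goes last
    {"role": "user", "content": "How much would a small project cost?"},
]
# This list is what gets passed as messages= to openai.ChatCompletion.create().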
For knowledge injection, placing automated retrieval into an assistant role, a previous user role, or the current user role is not ideal: in each case the injected text looks like something that was actually said in the conversation. The assistant role is probably best, and it can have a prefix like "here's knowledge I retrieved relevant to this conversation".
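If you do go with the assistant role, the injected message might look something like this; the prefix wording and the retrieved text are placeholders:

retrieved_text = "Jack's services include AI programming and custom AI applications."

injection_message = {
    "role": "assistant",
    "content": (
        "Here's knowledge I retrieved relevant to this conversation:\n"
        + retrieved_text
    ),
}
# Insert this message into the messages list just before the latest user input.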
OpenAI should have included a “documentation” role (for RAG) from the start.
One of the more intriguing roles to use is "function" - as if a function had been called, but without actually defining a real function. Function messages are normally injected after the user input, and the latest chat models are trained to understand them.
import openai

response = openai.ChatCompletion.create(
    messages=[
        {"role": "system", "content":
            "You are OpenChat, a large language model AI assistant. "
            "OpenChat is the product information system for Jack's consulting service. "
            "AI pretrained knowledge cutoff 2021-09-01."
        },
        {"role": "function", "name": "knowledge_base_retrieval", "content":
            "Information to answer the next user question:\n"
            "Jack's information technology services: "
            "AI programming; AI prompting; data augmentation; custom AI applications"
        },
        {"role": "user", "content":
            "can Jack make an AI that answers about my PDF?"
        },
    ],
    model="gpt-3.5-turbo-0613",
    max_tokens=300,
    temperature=0.2,
    # functions=function_list
)
Same question and answer as last time: How can I make the bot a little bit smarter? - #6 by _j
I should add: if trying the function role, the AI will want to answer directly from that information. It is only practical if you return retrieved knowledge only when there are very high-quality matches.
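A minimal sketch of that kind of gate, assuming a hypothetical retriever that returns a best match and a similarity score; the threshold, function name, and retriever API are made up for illustration:

SIMILARITY_THRESHOLD = 0.85  # made-up cutoff; tune for your own retriever

def build_function_message(query, retriever):
    """Return a function-role message only when the match is very high quality."""
    text, score = retriever.best_match(query)  # hypothetical retriever API
    if score < SIMILARITY_THRESHOLD:
        return None  # no injection; let the model answer from the prompt alone
    return {
        "role": "function",
        "name": "knowledge_base_retrieval",
        "content": "Information to answer the next user question:\n" + text,
    }

When the function returns None, just send the system message, chat history, and user input as usual, so the model isn't tempted to answer from weak or irrelevant retrievals.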