Imagine you have a company handbook and want to use OpenAI to build an FAQ bot on top of it, so colleagues can ask any question and the bot answers according to the handbook. How would I do this? Prompting with the whole handbook is not possible, and fine-tuning completions needs too much data, I think. Any ideas? I'd really appreciate some exchange here.
It’s a great question, and while it seems straightforward, there are several ways to build a solution that is also financially practical. Every AI solution has an operating cost, so it's best to factor these in early, during the requirements phase.
@PaulBellow referenced a really good approach using embeddings. I’ve built three KM systems using this exact approach, and embeddings have worked well. I’ve also built a few GPT Q&A systems for personal knowledge management that interoperate at the OS level, making it possible for the solution to work in every app context. More about that here.
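The embeddings approach boils down to: embed each handbook chunk once, embed the incoming question, and retrieve the chunk with the highest cosine similarity as context for the answer. A minimal sketch of that retrieval step is below; the toy vectors stand in for real embeddings you would get from an embeddings endpoint, and the handbook snippets are made up:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def top_match(question_vec, chunk_vecs, chunks):
    """Return the handbook chunk whose embedding is closest to the question."""
    scores = [cosine_similarity(question_vec, v) for v in chunk_vecs]
    best = int(np.argmax(scores))
    return chunks[best], scores[best]

# Toy 2-d vectors standing in for real embeddings of handbook chunks.
chunks = [
    "Vacation policy: 25 days per year.",
    "Expense reports are due monthly.",
]
chunk_vecs = [np.array([1.0, 0.1]), np.array([0.1, 1.0])]

# Pretend this is the embedding of "How many vacation days do I get?"
question_vec = np.array([0.9, 0.2])

context, score = top_match(question_vec, chunk_vecs, chunks)
print(context)  # the most relevant handbook chunk
```

In a real system you would precompute and store the chunk embeddings, then pass the retrieved chunk into the prompt rather than the whole handbook.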
To bring in your company data, begin by creating a knowledge base, then use the system and assistant roles as shown in this notebook:
Thank you. One question: since I submit fixed texts by role, how can I make OpenAI automatically find the right answer, ideally drawing on all the submitted information, even for questions I have not yet added a matching entry for?
This might help.
You have pointed out THE critical aspect of implementing conversational AI!
There are three aspects of implementation:
- Project goal. Knowing exactly what we expect our AI to do.
- Designing a good prompt dataset, so that the application knows the answers when a person uses the AI.
- And the most difficult part, which you have pointed out: building the dataset itself. A project dataset needs at least two columns: the prompt and the response.
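To make the two-column idea concrete, here is a minimal sketch of such a dataset loaded into prompt/response pairs (the rows are made-up handbook Q&A examples):

```python
import csv
import io

# A toy two-column dataset: one prompt column, one response column.
raw = """prompt,response
How many vacation days do I get?,25 days per year.
When are expense reports due?,They are due monthly.
"""

# Parse the CSV into (prompt, response) pairs.
rows = list(csv.DictReader(io.StringIO(raw)))
pairs = [(r["prompt"], r["response"]) for r in rows]
print(pairs[0])
```

In practice the same pairs could be kept in a spreadsheet or database; what matters is that every row maps one expected question to one grounded answer.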
To address exactly this issue, OpenAI's API now supports a system message and an assistant message on top of the user message.
For more, you can read and run the following notebook I shared on GitHub.
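As a quick illustration of those roles, here is a minimal sketch of how the retrieved handbook excerpt and the user's question can be arranged into the messages list that a chat completions call expects (the excerpt, question, and helper name are made up for the example):

```python
def build_messages(handbook_context, question):
    """Assemble chat messages: the system message pins the bot to the
    handbook excerpt, and the user message carries the live question.
    Assistant messages could additionally seed known Q&A pairs."""
    return [
        {
            "role": "system",
            "content": "Answer only from this handbook excerpt:\n" + handbook_context,
        },
        {"role": "user", "content": question},
    ]

messages = build_messages(
    "Vacation policy: 25 days per year.",
    "How many vacation days do I get?",
)
# `messages` would then be passed to the chat completions endpoint.
print(messages[0]["role"], "->", messages[-1]["role"])
```

Keeping the handbook excerpt in the system message, rather than the user message, makes it harder for an arbitrary user question to override the grounding instructions.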