Looks like that tutorial is using LangChain. Note that you aren't "training" the LLM, though. Try running LangChain in verbose mode and you can see the prompts it builds and how it stuffs your local data into the prompt. Under the hood it uses embeddings and vector storage; here's an explainer from another community member:
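To make the "stuffing" concrete, here is a toy sketch (not LangChain's actual code) of what a retrieval chain does under the hood. The document texts, question, and embedding vectors are all made up for illustration; a real app would get the vectors from an embedding model and the answer from an LLM call:

```python
import math

def cosine(a, b):
    # Cosine similarity between two vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# 1. "Index" local documents as (text, embedding) pairs.
#    These 3-dim vectors are fake; a real pipeline would call
#    an embedding model and store the vectors in a vector DB.
docs = [
    ("Our refund policy lasts 30 days.", [0.9, 0.1, 0.0]),
    ("The office cat is named Turing.",  [0.0, 0.2, 0.9]),
]

# 2. Embed the user's question (again, a fake vector here).
question = "How long do refunds last?"
q_vec = [0.8, 0.2, 0.1]

# 3. Retrieve the chunk most similar to the question...
best = max(docs, key=lambda d: cosine(d[1], q_vec))

# 4. ...and "stuff" it into the prompt sent to the LLM.
#    No weights are updated anywhere: the model only ever
#    sees your data inside the prompt text itself.
prompt = (
    "Answer using only the context below.\n"
    f"Context: {best[0]}\n"
    f"Question: {question}"
)
print(prompt)
```

This is what verbose mode lets you observe: the final prompt is ordinary text with your documents pasted in, which is why no training happens.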