Let me know if you do. It’s important to distinguish between kinds of memory. One type is episodic recall; another is declarative knowledge. Declarative knowledge can be baked into GPT-3 via fine-tuning. Episodic knowledge possibly can be as well, but I doubt it will be high quality. Until they iron out problems like confabulation, fine-tuning will not be a reliable way to record that kind of memory.
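To make the declarative case concrete: GPT-3's fine-tuning pipeline accepts a JSONL file of prompt/completion pairs, so declarative facts can be serialized into training examples like this. A minimal sketch; the facts and filename are illustrative, not from any real dataset:

```python
import json

# Illustrative declarative facts to bake in via fine-tuning.
facts = [
    ("What is the boiling point of water at sea level?", " 100 degrees Celsius."),
    ("Who wrote 'On the Origin of Species'?", " Charles Darwin."),
]

# GPT-3's legacy fine-tuning format: one JSON object per line,
# each with "prompt" and "completion" keys.
with open("declarative_facts.jsonl", "w") as f:
    for prompt, completion in facts:
        f.write(json.dumps({"prompt": prompt, "completion": completion}) + "\n")
```

Episodic memory resists this framing because an episode is a one-off event, not a repeatable question-answer pattern, which is part of why I doubt fine-tuning will capture it reliably.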