Proposing a Long-term Memory Mechanism for GPT Models

I have conceived a method to endow GPT models with long-term memory capabilities.

The essence of this memory lies in storing the content to be remembered on disk, letting the GPT model know which memories it possesses, and letting it decide when to retrieve them.

The stored memories are not all transmitted to GPT during conversations; instead, a specific indexing method is chosen so that GPT can search for memories in a particular way (e.g., by time, keywords, or scenario). This might require integrating other NLP models to emulate how the human brain retrieves memories.

Here are several methods for memory indexing; a rough code sketch of this structure follows the list:

  1. Core Memories: Content that profoundly influences one’s behavior or thought process (e.g., “Use polite language,” “Always consider societal implications”) is transmitted to GPT as prompts in every interaction. To prevent excessive token consumption, all “core memories” must be concise and actionable.
  2. Keyword-indexed Memories: Memories that are not always needed but are accessed fairly often are stored under key terms and retrieved through keyword indexing. For instance, the “library rules” are stored with “library rules” as the key term and retrieved with a keyword search such as “library|reading room rules|regulations|agreement” when needed.
  3. Time-indexed Memories: Memories that are rarely accessed, or that are usually recalled by time, are stored with time as the key. Collections of memories are associated with specific dates, much like our recollection of what happened on a given day. Memories can also be grouped into named collections for efficient retrieval, with a shared characteristic as the naming convention (e.g., “Happy Moments Collection”).
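
To make the three indexing methods concrete, here is a minimal sketch in Python. The class and method names (MemoryStore, add_core, search_keywords, and so on) are my own placeholders rather than any existing library, and a real version would persist to disk instead of keeping everything in dictionaries:

```python
import re
from collections import defaultdict
from datetime import date


class MemoryStore:
    """Toy illustration of core / keyword-indexed / time-indexed memories."""

    def __init__(self):
        self.core_memories = []                  # prepended to every prompt
        self.keyword_memories = {}               # key term -> memory text
        self.time_memories = defaultdict(list)   # ISO date -> memory texts
        self.collections = defaultdict(list)     # collection name -> memory texts

    # 1. Core memories: short, actionable, always sent to the model.
    def add_core(self, text: str) -> None:
        self.core_memories.append(text)

    # 2. Keyword-indexed memories, retrieved with a pattern such as
    #    "library|reading room rules|regulations|agreement".
    def add_keyword(self, key_term: str, text: str) -> None:
        self.keyword_memories[key_term] = text

    def search_keywords(self, pattern: str) -> list[str]:
        regex = re.compile(pattern, re.IGNORECASE)
        return [text for key, text in self.keyword_memories.items()
                if regex.search(key)]

    # 3. Time-indexed memories, optionally grouped into named collections
    #    such as "Happy Moments Collection".
    def add_timed(self, when: date, text: str, collection: str | None = None) -> None:
        self.time_memories[when.isoformat()].append(text)
        if collection:
            self.collections[collection].append(text)

    def recall_by_date(self, when: date) -> list[str]:
        return self.time_memories.get(when.isoformat(), [])

    def recall_collection(self, name: str) -> list[str]:
        return self.collections.get(name, [])
```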

During interactions, GPT is asked to categorize the dialogue content according to a “Storage Method/Importance Classification” (a code sketch of this step follows the list below). GPT evaluates the importance of the user’s input and classifies it as follows:

  • General: Used for everyday experiences, e.g., a trip on “2023.3.22” is stored under the “Travel Experiences” collection (time-indexed).
  • Important: The content is summarized under a core term and stored as a keyword-indexed memory.
  • Core: Profoundly impactful content is summarized and stored as a core memory (or replaces an old core memory).
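
One way to implement this classification step is to ask the model itself to label each message. The prompt wording, the JSON fields, and the choice of model below are assumptions of mine, not part of the original proposal; only the chat-completions call itself is standard API usage:

```python
import json
from openai import OpenAI  # official openai package, v1-style client

client = OpenAI()

CLASSIFY_INSTRUCTIONS = (
    "Classify the user's message for long-term storage. Reply with JSON only:\n"
    '{"classification": "General|Important|Core", "core_term": "...", '
    '"content": "...", "collections": ["..."]}'
)


def classify_for_memory(user_message: str) -> dict:
    """Ask the model for a storage decision on one message (sketch only)."""
    resp = client.chat.completions.create(
        model="gpt-4",  # any chat-capable model would do
        messages=[
            {"role": "system", "content": CLASSIFY_INSTRUCTIONS},
            {"role": "user", "content": user_message},
        ],
    )
    # json.loads assumes the model followed the format; real code should validate.
    return json.loads(resp.choices[0].message.content)
```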

The example below should make this classification easier to follow:

User: Mojo language was launched in May 2023, initially only usable in browser-based Jupyter Notebooks. Starting September 2023, a Linux native version became available. The team at Modular also developed a Visual Studio Code extension for Mojo language. Mojo, developed by Modular, is a programming language based on the MLIR compiler framework, aimed at providing a unified programming framework for AI and related software development fields.
Classification: Important, Core term “Mojo”
Content: Mojo language, launched in May 2023, was initially only usable in browser-based Jupyter Notebooks. Starting September 2023, a Linux native version became available. The team at Modular also developed a Visual Studio Code extension for Mojo language. Mojo is a programming language based on the MLIR compiler framework, developed by Modular, aimed at providing a unified programming framework for AI and related software development fields.
Add to Memory Collection “Significant Events of 2023”
Add to Memory Collection “Programming Languages”
Add to Memory Collection “High Performance Programming Languages”
Add to Memory Collection “Python”
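
Continuing the MemoryStore sketch from above (again, the names are my own placeholders), the Mojo example could then be filed and later retrieved roughly like this:

```python
store = MemoryStore()

# Result of the classification step above (content abbreviated here).
decision = {
    "classification": "Important",
    "core_term": "Mojo",
    "content": "Mojo, developed by Modular, is a programming language based on "
               "the MLIR compiler framework, launched in May 2023.",
    "collections": [
        "Significant Events of 2023",
        "Programming Languages",
        "High Performance Programming Languages",
        "Python",
    ],
}

store.add_keyword(decision["core_term"], decision["content"])
for name in decision["collections"]:
    store.collections[name].append(decision["content"])

# Later, a question that mentions Mojo triggers a keyword search, and the hits
# are injected into the prompt alongside the core memories.
print(store.search_keywords("Mojo|MLIR|Modular"))
```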


Yes, you are right: when you open a new “Context Window”, everything will be gone and not remembered. I don’t know exactly how our brains store information, but sometimes we don’t remember it all; we just imagine (picture) the action at that time.

Humans always try to figure out how our brains work; maybe that’s what drives AI development.


Yeah, one project I have been ruminating on is building my own knowledge graph database containing all the states of the AI, such as emotion, time, and the text content of everything (for vector and keyword similarity).

The entire thing evolves over time, and the personalities shift along with it.
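
For what it’s worth, a node in such a graph might look something like the sketch below; the field names and the embedding placeholder are only assumptions to make the idea concrete:

```python
from dataclasses import dataclass, field
from datetime import datetime


@dataclass
class MemoryNode:
    text: str                                              # raw text content
    emotion: str                                           # e.g. "curious"
    timestamp: datetime
    embedding: list[float] = field(default_factory=list)   # for vector similarity
    related: list[str] = field(default_factory=list)       # ids of linked nodes
```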