How can I add a memory layer to the ChatGPT API? I mean without storing all the memory in its context. I skimmed some articles; there are Mem0 and RAG for LLM memory. Can we integrate them with the ChatGPT API, or is there another solution? It's OK to call multiple functions before returning the response from the ChatGPT API; it doesn't need to be a single call.
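To illustrate the multi-call pattern you describe, here is a minimal sketch of a retrieve-then-respond memory layer. The `MemoryStore` class and the keyword-overlap scorer are my own toy stand-ins (real backends like Mem0 use vector embeddings and a database); the final `client.chat.completions.create` call is only indicated in a comment, since it needs an API key.

```python
class MemoryStore:
    """Toy memory store. Hypothetical stand-in for a real backend
    (Mem0, Zep, etc.); keyword overlap replaces embedding similarity
    so the example runs offline."""

    def __init__(self):
        self.memories = []  # stored text snippets

    def add(self, text):
        self.memories.append(text)

    def search(self, query, k=2):
        # Toy relevance score: count shared lowercase words
        # (a stand-in for cosine similarity over embeddings).
        q = set(query.lower().split())
        scored = [(len(q & set(m.lower().split())), m) for m in self.memories]
        scored = [(s, m) for s, m in scored if s > 0]
        scored.sort(key=lambda sm: sm[0], reverse=True)
        return [m for _, m in scored[:k]]


def build_messages(store, user_msg):
    # Call 1: retrieve relevant memories, then inject them as system
    # context. Call 2 (not executed here) would send these messages
    # to the Chat Completions API.
    relevant = store.search(user_msg)
    system = "Known facts about the user:\n" + "\n".join(relevant)
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": user_msg},
    ]


store = MemoryStore()
store.add("The user's name is Dana.")
store.add("The user prefers Python over Java.")
msgs = build_messages(store, "Which language does the user prefer?")
# msgs now carries the retrieved memories as system context; pass it to
# client.chat.completions.create(model=..., messages=msgs) for the reply.
```

The point is that memory retrieval happens in a separate step before the ChatGPT API call, so only the relevant snippets land in the context window rather than the whole history.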
There are multiple memory backend solutions. To the best of my knowledge, there are Zep, Letta, Mem0, and Memobase. I think Memobase does have an integration with the OpenAI API, where you can put a user ID in the OpenAI client.