Will Memory capabilities come to the API?

The ChatGPT memory feature is great! I want to add this capability for my users in my app, but it’s not available in the API. Before I develop a custom solution: is this feature on the API roadmap?

I’ve done this by making a tool for agents that lets them write memories to the end of their own instructions files. There are probably much better ways to do this, but for something simple like “Call me Billy Bob from now on” or “Always respond like a pirate”, it works just fine for me.
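A minimal sketch of that approach, assuming a plain-text instructions string and a tool the agent can call (the helper name and the “# Memories” section layout are my own illustration, not the poster’s actual implementation):

```python
MEMORY_HEADER = "\n\n# Memories\n"

def save_memory(instructions: str, memory: str) -> str:
    """Append a memory line to the end of an agent's instructions text."""
    if MEMORY_HEADER not in instructions:
        instructions += MEMORY_HEADER
    return instructions + f"- {memory}\n"

# The agent calls save_memory as a tool; the updated instructions
# are then used for all subsequent runs.
base = "You are a helpful assistant."
base = save_memory(base, "Call me Billy Bob from now on")
base = save_memory(base, "Always respond like a pirate")
print(base)
```

The agent itself only needs to be told the tool exists; persisting the updated instructions between sessions is left to the app.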


Yes, it will handle mundane memory like this. However, it will not remember data points between prompts. I have been developing an API for an interactive character creator that walks a player through life choices. For it to remember what was done in a previous prompt, I have to feed it a summary of that prompt.

So far the best way I have found is to have a base prompt and then amend it with a summary of the previous prompt. After 3–4 prompts, I have it re-summarize to keep the token count down. It is really frustrating.
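The rolling-summary pattern above can be sketched as a small helper class. Here `summarize` is a stand-in for a real model call, and the class name and threshold are my own illustration:

```python
def summarize(texts: list[str]) -> str:
    # Placeholder: a real implementation would call the chat API here
    # and ask the model to condense these summaries into one.
    return " / ".join(t[:40] for t in texts)

class RollingMemory:
    def __init__(self, base_prompt: str, resummarize_every: int = 4):
        self.base = base_prompt
        self.summaries: list[str] = []
        self.every = resummarize_every

    def add_turn(self, turn_summary: str) -> None:
        self.summaries.append(turn_summary)
        if len(self.summaries) >= self.every:
            # Collapse the accumulated summaries into one
            # to keep the token count down.
            self.summaries = [summarize(self.summaries)]

    def prompt(self) -> str:
        return self.base + "\n\nPrevious events:\n" + "\n".join(self.summaries)

mem = RollingMemory("You are a character-creation guide.")
for turn in ["chose elf", "chose ranger", "took an oath", "gained a rival"]:
    mem.add_turn(turn)
print(mem.prompt())
```

Each turn's summary is appended, and every few turns the list is collapsed, so the prompt grows roughly logarithmically rather than linearly.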

This basically shuts down my idea of using it in Foundry to generate on-the-fly encounters that are believable within my game settings, without tons of training and large token prompts.


I store the conversations in SQL, then pull them into a local LLM, build context from them, and send only what’s relevant to OpenAI.
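A dependency-free sketch of that flow, using SQLite for the store and a simple keyword-overlap score standing in for the local LLM’s relevance ranking (the table layout and function names are my assumptions):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE messages (role TEXT, content TEXT)")

def log(role: str, content: str) -> None:
    """Persist one conversation message."""
    conn.execute("INSERT INTO messages VALUES (?, ?)", (role, content))

def relevant(query: str, limit: int = 3) -> list[tuple]:
    """Rank stored messages by word overlap with the query and
    return the top `limit` - only these would be sent to OpenAI."""
    rows = conn.execute("SELECT role, content FROM messages").fetchall()
    words = set(query.lower().split())
    scored = sorted(rows,
                    key=lambda r: -len(words & set(r[1].lower().split())))
    return scored[:limit]

log("user", "my character is an elf ranger")
log("assistant", "noted")
print(relevant("elf ranger level two"))
```

A real setup would replace the keyword score with the local model (or embeddings), but the shape is the same: store everything, send only the relevant slice upstream.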

I don’t do a lot on openai, with all the outside money poured in, I’m basically letting my son play with it. I would like to see a huge benefit for developers.

I just trained an agent on Postgres. I know nothing about PostgreSQL or SQL commands, but my agent does, and it can create and run commands as well as execute against an in-memory database using SQLAlchemy. Works great.
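For illustration, here is roughly what agent-generated SQL executed in memory through SQLAlchemy looks like. The SQL strings would come from the agent; they are hard-coded here, and the table schema is my own assumption:

```python
from sqlalchemy import create_engine, text

# SQLite's :memory: database gives in-memory execution with no server.
engine = create_engine("sqlite:///:memory:")

with engine.begin() as conn:
    # In practice these statements are produced by the agent.
    conn.execute(text(
        "CREATE TABLE memories (id INTEGER PRIMARY KEY, note TEXT)"))
    conn.execute(text("INSERT INTO memories (note) VALUES (:n)"),
                 {"n": "Call me Billy Bob"})

with engine.connect() as conn:
    rows = conn.execute(text("SELECT note FROM memories")).fetchall()
print(rows)
```

Swapping the connection URL to a real `postgresql://` DSN moves the same code from the in-memory sandbox to Postgres.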

You can maybe use a truncation strategy to pull only the last message or the last X messages, which may help, or another strategy. Look up “truncation strategy” in the OpenAI docs.
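A local sketch of a “last X messages” truncation strategy, assuming the usual list-of-dicts message format. (The hosted Assistants API also accepts a `truncation_strategy` parameter when creating a run; see the OpenAI docs for that variant.)

```python
def truncate_last(messages: list[dict], keep: int) -> list[dict]:
    """Keep any system messages plus the last `keep` other messages."""
    system = [m for m in messages if m.get("role") == "system"]
    rest = [m for m in messages if m.get("role") != "system"]
    return system + rest[-keep:]

history = [{"role": "system", "content": "You are a GM assistant."}]
history += [{"role": "user", "content": f"turn {i}"} for i in range(10)]
print(truncate_last(history, 3))
```

Preserving the system message while trimming the tail keeps the persona stable even as old turns fall off.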

A secondary option is Solr.

Same here, this is what I also do.

Like others, I have also thought about how to do this.

I have been using the Assistants API, and this could be done with function calling, right?

I create an agent that has 2 functions.

One to get a memory, or a list of memories, from my database by making a GET API call. The assistant reads the information and then answers the prompt.

Another to create a memory and fill in its fields.

A third one could be for updating a memory.

This would be the simplest way to do this right?
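The three functions above can be sketched as OpenAI-style tool definitions plus local handlers. The schema shape follows the standard function-calling format; the dict-backed store, function names, and field names are my own illustration rather than a real database:

```python
memories: dict[int, str] = {}
_next_id = 1

def get_memories() -> dict[int, str]:
    """Return all stored memories (the GET call)."""
    return dict(memories)

def create_memory(content: str) -> int:
    """Store a new memory and return its id."""
    global _next_id
    memories[_next_id] = content
    _next_id += 1
    return _next_id - 1

def update_memory(memory_id: int, content: str) -> None:
    """Revise an existing memory in place."""
    memories[memory_id] = content

tools = [
    {"type": "function", "function": {
        "name": "get_memories",
        "description": "Fetch all stored memories",
        "parameters": {"type": "object", "properties": {}}}},
    {"type": "function", "function": {
        "name": "create_memory",
        "description": "Store a new memory",
        "parameters": {"type": "object",
                       "properties": {"content": {"type": "string"}},
                       "required": ["content"]}}},
    {"type": "function", "function": {
        "name": "update_memory",
        "description": "Revise an existing memory by id",
        "parameters": {"type": "object",
                       "properties": {"memory_id": {"type": "integer"},
                                      "content": {"type": "string"}},
                       "required": ["memory_id", "content"]}}},
]
```

When the model emits a tool call, the app dispatches to the matching handler and returns the result as the tool output.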

You’d want the memories to always be available, rather than wasting even more internal calls when the AI doesn’t know whether what’s in memory will be useful. Instead, keep sending back a memory section in “additional instructions”.

Show an index number with that injection, and then you could have a function that writes either to “new” or to an existing number.
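The indexed-injection idea can be sketched in a few lines: render the memories with index numbers into the additional-instructions block, and accept writes addressed to either `"new"` or an existing index (helper names and the section heading are my assumptions):

```python
def render_memory_section(memories: list[str]) -> str:
    """Format memories with index numbers for injection into
    the assistant's additional instructions."""
    lines = [f"{i}: {m}" for i, m in enumerate(memories)]
    return "Memories:\n" + "\n".join(lines)

def write_memory(memories: list[str], slot, content: str) -> list[str]:
    """Write to slot "new" (append) or to an existing index (overwrite)."""
    out = list(memories)
    if slot == "new":
        out.append(content)
    else:
        out[int(slot)] = content
    return out

mems = write_memory([], "new", "Respond like a pirate")
mems = write_memory(mems, 0, "Respond like a polite pirate")
print(render_memory_section(mems))
```

Because the model always sees the current indices, it can target a specific entry to revise instead of only appending.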

Or do like OpenAI: have another specialist AI that receives the memory from the assistant and handles determining its usefulness, filtering, revising, or creating entries. Then you can have the same fun as in ChatGPT, where instructing the assistant to send instructions to the memory AI lets it break out of the message container and inject instructions to jailbreak your whole app.