Will Memory capabilities come to the API?

The ChatGPT memory feature is great! I want to add this capability for my users in my app, but it’s not available in the API. Before I develop a custom solution, is this feature on the API roadmap?

I’ve done this by making a tool for agents that lets them write memories to the end of their own instructions files. There are probably much better ways to do this, but for something simple like “Call me Billy Bob from now on” or “Always respond like a pirate”, it works just fine for me.
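A minimal sketch of that append-to-instructions tool, assuming a plain text file on disk; the file path and function names here are placeholders, not from any particular SDK:

```python
from pathlib import Path

# Hypothetical location of the agent's instructions file.
INSTRUCTIONS_FILE = Path("agent_instructions.txt")

def remember(memory: str) -> str:
    """Tool the agent can call to persist a simple preference."""
    with INSTRUCTIONS_FILE.open("a", encoding="utf-8") as f:
        f.write(f"\n- {memory}")
    return f"Saved: {memory}"

def load_instructions() -> str:
    """Read the base instructions plus any appended memories."""
    return INSTRUCTIONS_FILE.read_text(encoding="utf-8")
```

Expose `remember` as a tool/function the model can call, and rebuild the system prompt from `load_instructions()` on each run.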


Yes, it will do mundane memory like this. However, it will not remember data points between prompts. I have been developing an API for an interactive character creator that walks a player through life choices. For it to remember what was done in a previous prompt, I have to feed it a summary of that prompt.

So far the best way I have found to do this is to have a base prompt and then append a summary of the previous prompt to it. After 3-4 prompts, I have it re-summarize to keep the token count down. It is really frustrating.
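The base-prompt-plus-rolling-summary pattern above can be sketched roughly like this. `summarize` stands in for a model call (here it's just a trivial truncation placeholder), and the class and constant names are my own invention:

```python
BASE_PROMPT = "You are a character-creation guide."
RESUMMARIZE_EVERY = 3  # re-summarize after 3-4 prompts to cap token count

def summarize(text: str) -> str:
    # Placeholder: in practice, ask the model to compress `text`.
    return text[:500]

class RollingMemory:
    def __init__(self) -> None:
        self.summary = ""
        self.turns_since_summary = 0

    def build_prompt(self, user_input: str) -> str:
        """Base prompt, amended with the running summary."""
        return f"{BASE_PROMPT}\nSummary so far: {self.summary}\nUser: {user_input}"

    def record(self, exchange: str) -> None:
        """Fold the latest exchange into the summary; compress periodically."""
        self.summary += " " + exchange
        self.turns_since_summary += 1
        if self.turns_since_summary >= RESUMMARIZE_EVERY:
            self.summary = summarize(self.summary)
            self.turns_since_summary = 0
```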

This basically shuts down my idea of using it in Foundry to generate on-the-fly encounters that are believable within my game settings, without tons of training and large token prompts.


I store the conversations in SQL, then pull them into a local LLM, build context from them, and send only what’s relevant to OpenAI.
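A rough sketch of that store-then-filter flow, with SQLite standing in for the SQL store and the local-LLM relevance pass reduced to a naive keyword match (both are assumptions, not the poster's actual setup):

```python
import sqlite3

# In-memory SQLite stands in for whatever SQL database you use.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE messages (role TEXT, content TEXT)")

def save(role: str, content: str) -> None:
    """Persist each conversation turn."""
    conn.execute("INSERT INTO messages VALUES (?, ?)", (role, content))

def relevant_context(query: str, limit: int = 5) -> list:
    """Placeholder for the local-LLM relevance pass: keyword filter."""
    rows = conn.execute(
        "SELECT content FROM messages WHERE content LIKE ? LIMIT ?",
        (f"%{query}%", limit),
    ).fetchall()
    return [r[0] for r in rows]
```

Only the rows `relevant_context` returns get packed into the prompt sent to the API, keeping token usage down.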

I don’t do a lot on OpenAI; with all the outside money poured in, I’m basically letting my son play with it. I would like to see a huge benefit for developers.

I just trained an agent on Postgres. I know nothing about PostgreSQL or SQL commands, but my agent does: it can create and run commands, as well as do in-memory execution using SQLAlchemy. Works great.
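A small generic example of the in-memory execution piece with SQLAlchemy (this is a sketch of the idea, not the poster's agent tooling; an in-memory SQLite engine stands in for Postgres):

```python
from sqlalchemy import create_engine, text

# In-memory database; swap the URL for a real Postgres DSN in practice.
engine = create_engine("sqlite:///:memory:")

def run_sql(statement: str, params: dict = None):
    """Run an agent-generated SQL statement and return any rows."""
    with engine.begin() as conn:
        result = conn.execute(text(statement), params or {})
        return result.fetchall() if result.returns_rows else []
```

Handing a function like `run_sql` to an agent as a tool lets it write and execute its own queries; you'd obviously want guardrails (allow-listed statements, a read-only role) before letting generated SQL run against real data.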

You can maybe use a truncation strategy to pull the last message or last X messages, which may help, or another strategy. Look up “truncation strategy” in the OpenAI docs.

A secondary option is Solr.

Same here; this is what I also do.