OpenAI API Conversation Memory

Hi there,

I have been experimenting with the ChatGPT API (using Node) and I noticed that when the message history gets too big it returns a "headers too large" error.

I wish to implement a mechanism that would allow ChatGPT to remember earlier parts of the conversation in order to create a more personalized chat experience.

I was wondering, is this possible at the moment? What can I do to implement such a feature?

Please advise.

Thank you.


You would ideally have your own database. However, all that's really needed is your own trimming function (usually done by summarizing the conversation), along the lines of the sketch below.
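For example, a minimal sketch in Node/TypeScript, assuming the official `openai` npm package; the model name, thresholds, and summary prompt are just illustrative placeholders:

```typescript
// Sketch of a trimming function that summarizes older messages when the
// history grows too long. Assumes the official `openai` npm package (v4+).
import OpenAI from "openai";

type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

const client = new OpenAI(); // reads OPENAI_API_KEY from the environment

const MAX_MESSAGES = 20; // prune once the history exceeds this (tune as needed)
const KEEP_RECENT = 6;   // always keep the most recent turns verbatim

async function trimHistory(history: ChatMessage[]): Promise<ChatMessage[]> {
  if (history.length <= MAX_MESSAGES) return history;

  const older = history.slice(0, history.length - KEEP_RECENT);
  const recent = history.slice(history.length - KEEP_RECENT);

  // Ask the model to compress the older turns into a short summary.
  const summary = await client.chat.completions.create({
    model: "gpt-3.5-turbo",
    messages: [
      {
        role: "system",
        content:
          "Summarize the following conversation in a few sentences, keeping any facts about the user.",
      },
      { role: "user", content: older.map(m => `${m.role}: ${m.content}`).join("\n") },
    ],
  });

  // Replace the older turns with a single system message carrying the summary.
  return [
    {
      role: "system",
      content: `Summary of earlier conversation: ${summary.choices[0].message.content ?? ""}`,
    },
    ...recent,
  ];
}
```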

You can use any type of database, even local files. However, with scaling and design philosophy in mind, I'd say it's a good idea to get comfortable with vector databases. I would recommend Pinecone: not only is it free, it's lightning fast, and they are constantly implementing new features to adapt to more use cases. It's also very nice if you'd like to perform analytics on sessions based on their semantic relevance.

In this case, I’d set the conversation session ID as the namespace and use that for retrieval.
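To make that concrete, here is a rough sketch assuming the `openai` and `@pinecone-database/pinecone` packages and an already-created index; the index name, embedding model, and helper names are placeholders, not anything from this thread:

```typescript
// Sketch of per-session storage in Pinecone, using the conversation's session
// ID as the namespace. Assumes an existing index; names are hypothetical.
import OpenAI from "openai";
import { Pinecone } from "@pinecone-database/pinecone";

const openai = new OpenAI();
const pinecone = new Pinecone({ apiKey: process.env.PINECONE_API_KEY! });
const index = pinecone.index("chat-memory"); // hypothetical index name

async function embed(text: string): Promise<number[]> {
  const res = await openai.embeddings.create({
    model: "text-embedding-3-small",
    input: text,
  });
  return res.data[0].embedding;
}

// Store one message under the session's namespace.
async function remember(sessionId: string, messageId: string, text: string) {
  await index.namespace(sessionId).upsert([
    { id: messageId, values: await embed(text), metadata: { text } },
  ]);
}

// Retrieve the most semantically relevant past messages for a new query.
async function recall(sessionId: string, query: string, topK = 5) {
  const results = await index.namespace(sessionId).query({
    vector: await embed(query),
    topK,
    includeMetadata: true,
  });
  return results.matches?.map(m => m.metadata?.text) ?? [];
}
```

Using the session ID as the namespace keeps each conversation's vectors isolated, so a retrieval query only ever searches within that session.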

Thanks Ronald for the suggestion, I'll look into Pinecone.

So exciting!


It really is!

Again, you can start off with a simple key-value store like a JSON file, or even an SQL database. Vector databases are wonderful, though.
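For reference, a minimal sketch of that simpler approach: persisting each session's history in a local JSON file keyed by session ID (the file path and helper names are just illustrative):

```typescript
// Minimal sketch of a JSON-file key-value store for conversation history.
import { promises as fs } from "fs";

type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

const STORE_PATH = "./conversations.json"; // illustrative path

async function loadStore(): Promise<Record<string, ChatMessage[]>> {
  try {
    return JSON.parse(await fs.readFile(STORE_PATH, "utf8"));
  } catch {
    return {}; // first run: no file yet
  }
}

// Append one message to the given session's history and persist it.
async function saveMessage(sessionId: string, message: ChatMessage) {
  const store = await loadStore();
  store[sessionId] = [...(store[sessionId] ?? []), message];
  await fs.writeFile(STORE_PATH, JSON.stringify(store, null, 2));
}

async function getHistory(sessionId: string): Promise<ChatMessage[]> {
  return (await loadStore())[sessionId] ?? [];
}
```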

Hi @humbooz

You need to write some code that implements your messages-array pruning, filtering, and/or summarization strategies.
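As one example of a pruning strategy (not the only way to do it), here is a sketch that keeps the system prompt and drops the oldest turns until the history fits a rough token budget; the 4-characters-per-token estimate is a crude stand-in for a real tokenizer:

```typescript
// Rough pruning sketch: keep the system prompt, then keep as many of the most
// recent turns as fit an approximate token budget.
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

const TOKEN_BUDGET = 3000; // leave room for the model's reply

// Crude heuristic: roughly 4 characters per token.
const approxTokens = (m: ChatMessage) => Math.ceil(m.content.length / 4);

function pruneMessages(messages: ChatMessage[]): ChatMessage[] {
  const [system, ...rest] = messages; // assumes the first message is the system prompt
  const kept: ChatMessage[] = [];
  let used = approxTokens(system);

  // Walk backwards so the most recent turns are kept first.
  for (let i = rest.length - 1; i >= 0; i--) {
    const cost = approxTokens(rest[i]);
    if (used + cost > TOKEN_BUDGET) break;
    kept.unshift(rest[i]);
    used += cost;
  }
  return [system, ...kept];
}
```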

For example, see this topic:

HTH

🙂

Hey @humbooz - Curious how you’ve been making out on this topic. I want to implement this as well.