Persistent Chats with GPT using the API

Hi friends,
I wanted to ask for help with persistent chats with GPT. In the official ChatGPT UI we can talk about a specific topic continuously and ChatGPT remembers the previous chat.
But with the API, how do we keep such sessions?
Thank you

You just need to append the old messages to the back of the new prompt you send…
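For example, here is a minimal sketch using the Python `openai` client (the model name, system prompt, and helper function are just placeholders): keep one running `messages` list and re-send the whole list on every call.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# One list holds the whole conversation; it is re-sent on every request.
messages = [{"role": "system", "content": "You are a helpful assistant."}]

def ask(user_text: str) -> str:
    messages.append({"role": "user", "content": user_text})
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # any chat model works here
        messages=messages,
    )
    answer = response.choices[0].message.content
    # Append the assistant's reply too, so the next call can see it.
    messages.append({"role": "assistant", "content": answer})
    return answer

print(ask("My name is Alice."))
print(ask("What is my name?"))  # answerable only because the history was re-sent
```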

2 Likes

Thanks for the support,
but wouldn't this use up a lot of tokens?

1 Like

Yup. Unfortunately, that’s the only way at the moment. One thing you can try to do is summarize the very old chats to get their tokens down. Good luck!
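One rough way to do that is to ask the model itself to compress the oldest messages into a single summary message. A minimal sketch, not a drop-in solution; the `keep_last` threshold and the prompt wording are arbitrary assumptions:

```python
def compress_history(client, messages, keep_last=6):
    """Replace everything except the last `keep_last` messages with one summary."""
    if len(messages) <= keep_last + 1:  # +1 for the system message at index 0
        return messages
    old, recent = messages[1:-keep_last], messages[-keep_last:]
    transcript = "\n".join(f"{m['role']}: {m['content']}" for m in old)
    summary = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user",
                   "content": "Summarize this conversation in a few sentences:\n" + transcript}],
    ).choices[0].message.content
    # Keep the original system prompt, then the summary, then the most recent turns.
    return [messages[0],
            {"role": "system", "content": "Summary of earlier conversation: " + summary}] + recent
```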

1 Like

There are two mainstream ways to achieve long-term memory at the moment:

  1. Embeddings - divide the past conversation into parts, use embeddings to search for the parts most related to the latest part of the conversation, and include those related parts at the beginning of the prompt (sketched in the code below).
  2. Summarization - summarize the earlier parts of the conversation and include the summary at the beginning of the prompt.

I suppose the LangChain library can help you achieve both.
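To make approach 1 concrete, here is a minimal sketch using the OpenAI embeddings endpoint and plain cosine similarity. The chunking, in-memory storage, embedding model choice, and `top_related` helper are all simplifying assumptions, not a production retrieval setup:

```python
import numpy as np
from openai import OpenAI

client = OpenAI()

def embed(texts):
    """Embed a list of strings; results come back in the same order."""
    resp = client.embeddings.create(model="text-embedding-ada-002", input=texts)
    return [np.array(d.embedding) for d in resp.data]

def top_related(past_chunks, latest_message, k=3):
    """Return the k past conversation chunks most similar to the latest message."""
    vectors = embed(past_chunks + [latest_message])
    chunk_vecs, query_vec = vectors[:-1], vectors[-1]
    sims = [float(v @ query_vec / (np.linalg.norm(v) * np.linalg.norm(query_vec)))
            for v in chunk_vecs]
    ranked = sorted(zip(sims, past_chunks), reverse=True)
    return [chunk for _, chunk in ranked[:k]]

# The returned chunks can then be prepended to the prompt as extra context.
```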

2 Likes

Thanks, I appreciate your support. I hope OpenAI keeps this in consideration for upcoming updates.

This is a thread I started discussing the very same question: Chat Completion Architecture

This is the new flowchart I put together, which does exactly what you're asking for: it maintains the chat context between API calls.

And a video I put together showing the process in action: SolrAI: How the chat completion process with standalone question maintains conversational context - YouTube
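For anyone who just wants the core idea without watching the video: the usual "standalone question" pattern asks the model to rewrite the latest user message, given the recent history, into a question that makes sense on its own, and then answers that rewritten question. A rough, generic sketch of that pattern (the prompt wording and model are placeholders, not the exact SolrAI implementation):

```python
def make_standalone_question(client, history, latest_question):
    """Condense chat history plus the new question into one self-contained question."""
    transcript = "\n".join(f"{m['role']}: {m['content']}" for m in history)
    prompt = (
        "Given the conversation below, rewrite the follow-up question so it can be "
        f"understood on its own.\n\nConversation:\n{transcript}\n\n"
        f"Follow-up question: {latest_question}\nStandalone question:"
    )
    return client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": prompt}],
    ).choices[0].message.content
```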

Good luck!

2 Likes

In my experience, not as many as you may think. And, as the chat history grows, you can always “prune” it, removing the earliest responses.
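A minimal sketch of that kind of pruning, assuming you count tokens with the `tiktoken` library (the budget number is arbitrary, and the count ignores the small per-message overhead):

```python
import tiktoken

def prune_history(messages, max_tokens=3000, model="gpt-3.5-turbo"):
    """Drop the oldest non-system messages until the history fits a rough token budget."""
    enc = tiktoken.encoding_for_model(model)

    def count(msgs):
        # Rough count: message text only.
        return sum(len(enc.encode(m["content"])) for m in msgs)

    pruned = list(messages)
    while count(pruned) > max_tokens and len(pruned) > 2:
        # Index 0 is assumed to be the system prompt; drop the oldest turn after it.
        del pruned[1]
    return pruned
```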

1 Like