ChatGPT API is not able to remember conversations

Each ChatGPT API request starts a new conversation.
I want it to remember what I said in my previous conversation.

One way to do this is to pass the whole record of the previous conversation with each request, but that is far too expensive in terms of tokens.
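
In practice, "passing the whole record" just means resending the full message list on every call. A minimal sketch with the Python openai package (the history list and ask helper are only illustrative):

```python
# Sketch: replay the entire message history with every request.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
history = [{"role": "system", "content": "You are a helpful assistant."}]

def ask(user_input: str) -> str:
    # Append the new user turn and resend the whole history every time.
    history.append({"role": "user", "content": user_input})
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=history,  # the whole conversation so far; grows every turn
    )
    reply = response.choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    return reply
```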

I don't get why it doesn't have the capability to remember things.
If I want to build the same kind of chat application as ChatGPT using the API, I won't be able to, since it doesn't store previous conversations.


If you want to store conversations, you’ll have to do it on your end. Use a storage solution that best suits your needs.


I think the only way, and the best way for now, is to keep the previous conversation in the prompt, or to make a summary of what happened in the past, or to combine both. The ChatGPT app itself keeps the recent conversation in the prompt. CMIIW
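
A rough sketch of the summarization idea, assuming the Python openai package (the prompt wording and the keep_last cutoff are just placeholders):

```python
# Sketch: when the history gets long, collapse old turns into a summary
# and keep only the most recent messages verbatim.
from openai import OpenAI

client = OpenAI()

def compress(history, keep_last=4):
    """Summarize everything except the last `keep_last` messages."""
    old, recent = history[:-keep_last], history[-keep_last:]
    if not old:
        return history
    transcript = "\n".join(f"{m['role']}: {m['content']}" for m in old)
    summary = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user",
                   "content": "Briefly summarize this conversation:\n" + transcript}],
    ).choices[0].message.content
    # Replace the old turns with one short system message.
    return ([{"role": "system",
              "content": "Summary of earlier conversation: " + summary}]
            + recent)
```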

So essentially, if it's a very long conversation, the history that has to be sent in each time to give it context would become too long, and the number of tokens would keep growing too?

Just a random thought: keep keywords/concepts as the conversation grows and send them over with each chat completion call? (Obviously all this information has to be kept within our own systems.)
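
Something along these lines, perhaps (only a sketch; the extraction prompt and the notes list are made up, and everything stays on our own side):

```python
# Sketch: accumulate keywords/concepts as the chat grows and send only
# those, plus the new prompt, with each chat completion call.
from openai import OpenAI

client = OpenAI()
notes = []  # keywords/concepts extracted so far, stored in our own system

def remember(turn_text: str) -> None:
    # Ask the model to distill a turn into a few keywords/facts.
    keywords = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user",
                   "content": "List the key facts or concepts in this message, "
                              "comma-separated:\n" + turn_text}],
    ).choices[0].message.content
    notes.append(keywords)

def ask(user_input: str) -> str:
    # Send the accumulated notes instead of the full transcript.
    messages = [{"role": "system", "content": "Known context: " + "; ".join(notes)},
                {"role": "user", "content": user_input}]
    response = client.chat.completions.create(model="gpt-3.5-turbo",
                                               messages=messages)
    return response.choices[0].message.content
```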


This is more than likely because the OpenAI infrastructure is not yet ready to support the many millions of users and apps that would need data stored during session management on the OpenAI server / cloud side.

You can do it, but as this is a developer community, you need to code your own session management and chat/conversation management and storage solution.

Yes, that requires some application coding, often a database with tables for the following (a rough sketch follows the list):

  • Users (if more than one user will be accessing your app)
  • Chat Topics (where you keep the description of the chat, the chatid, etc.)
  • Chat Messages (where you store the chatid, role, content, and any API params you might want to save for whatever reason, e.g. debugging)
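
A bare-bones version of that schema, sketched with Python's built-in sqlite3 module (table and column names are just examples):

```python
# Sketch of the three tables above using the standard-library sqlite3 module.
import sqlite3

db = sqlite3.connect("chat.db")
db.executescript("""
CREATE TABLE IF NOT EXISTS users (
    user_id   INTEGER PRIMARY KEY,
    name      TEXT
);
CREATE TABLE IF NOT EXISTS chat_topics (
    chat_id     INTEGER PRIMARY KEY,
    user_id     INTEGER REFERENCES users(user_id),
    description TEXT
);
CREATE TABLE IF NOT EXISTS chat_messages (
    message_id  INTEGER PRIMARY KEY,
    chat_id     INTEGER REFERENCES chat_topics(chat_id),
    role        TEXT,      -- system / user / assistant
    content     TEXT,
    api_params  TEXT       -- optional JSON blob of params, for debugging etc.
);
""")
db.commit()
```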

HTH

:)

Suppose I want a chatbot using gpt-3.5-turbo to have sessions for different users, meaning that each user has their own embeddings query. Do you know how the logic works?

Of course. It is as simple as adding a user column to the chat conversation DB schema (or whatever table works for you), which I posted in a different topic here recently.
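
Roughly, each request just filters by that user, e.g. something like this sketch continuing the sqlite3 tables above (column names are illustrative):

```python
# Sketch: load one user's conversation from the chat_messages table and
# replay it to the API, so every user gets their own session/context.
def load_history(db, chat_id, user_id):
    rows = db.execute(
        """SELECT m.role, m.content
           FROM chat_messages m
           JOIN chat_topics t ON t.chat_id = m.chat_id
           WHERE m.chat_id = ? AND t.user_id = ?
           ORDER BY m.message_id""",
        (chat_id, user_id),
    ).fetchall()
    return [{"role": role, "content": content} for role, content in rows]
```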

It's just basic web dev ops, TBH 🚀

HTH

:)

I’ve been messing around with context and continuity with gpt-3.5-turbo, mostly as a distraction rather than anything serious. I’m using a transformer to tokenize a file of mostly facts and also the dialog history. The idea is that when you type a prompt, if there is a related sentence, it is plucked out of the text files and included with the prompt. It works pretty well: I can chat with the bot for quite a while and then ask it what my name is, and it responds correctly. The code sucks, but here it is: Bitbucket
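
The gist is roughly the following (not the actual Bitbucket code, just a minimal sketch assuming the sentence-transformers package and an illustrative model name):

```python
# Sketch: embed the fact file + dialog history, then pull the sentences
# most similar to the new prompt and include them with the prompt.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

def related_sentences(prompt, sentences, top_k=3):
    prompt_emb = model.encode(prompt, convert_to_tensor=True)
    sent_embs = model.encode(sentences, convert_to_tensor=True)
    hits = util.semantic_search(prompt_emb, sent_embs, top_k=top_k)[0]
    return [sentences[h["corpus_id"]] for h in hits]

# The returned sentences (e.g. "The user's name is Alice.") are then added
# to the messages sent with the prompt, so the bot "remembers" them.
```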


Is it possible for ChatGPT to recall all the previous chats I've had so that it can provide a personalized answer? Any help would be greatly appreciated.
