Saving Previous Conversations in GPT API

Hello OpenAI community,
I’ve been using the GPT API with Python and I’m interested in finding an efficient way to retain previous conversations without incurring additional token costs.

Currently, I’m building a chatbot, much like ChatGPT. My concern is that saving entire conversations and appending them to the prompt will lead to increased token usage and, subsequently, higher costs.

I’ve considered appending the conversation history directly, but I’m reaching out to the community for advice and best practices.

If anyone has experience or suggestions on how to handle this efficiently, I would greatly appreciate your insights.

Thank you in advance for your time and assistance!

Simply put, if possible you should handle that locally in your app.

I didn’t understand; could you explain briefly?

What platform is your chatbot being developed for? Does that have a database and the ability to store conversations? If so, use that.

Yes, I am saving the conversations to a MongoDB database.
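For reference, a minimal sketch of persisting chat turns in MongoDB with pymongo. The collection name, document shape, and `conversation_id` field are assumptions for illustration, not from this thread; the `collection` is any pymongo `Collection` you have already connected to.

```python
from datetime import datetime, timezone

def save_message(collection, conversation_id, role, content):
    """Append one chat turn as its own document."""
    collection.insert_one({
        "conversation_id": conversation_id,  # hypothetical key for grouping turns
        "role": role,                        # "user" or "assistant"
        "content": content,
        "ts": datetime.now(timezone.utc),    # used to restore chronological order
    })

def load_history(collection, conversation_id):
    """Rebuild the message list for one conversation, oldest first."""
    cursor = collection.find({"conversation_id": conversation_id}).sort("ts", 1)
    return [{"role": d["role"], "content": d["content"]} for d in cursor]
```

Storing one document per turn (rather than one big document per conversation) makes it cheap to load only the most recent messages later.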


But when I fetch all the chats from the database and include them in the prompt, it costs more tokens (if the conversation is long). How can I overcome this?

You can limit what you send using a “look-behind” setting. It’s a compromise: you can increase it to send more history, or shorten it to decrease cost (and, paradoxically, potentially increase the quality of the response, because the prompt is shorter).
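The “look-behind” idea can be sketched in a few lines: before each API call, keep only the system prompt plus the last few turns. The window size and the decision to always retain system messages are assumptions you would tune for your app.

```python
def trim_history(messages, max_turns=6):
    """Keep any system messages plus only the last `max_turns` other messages.

    `messages` is a list of {"role": ..., "content": ...} dicts in the
    Chat Completions format.
    """
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    # Slicing with a negative index keeps only the most recent turns.
    return system + rest[-max_turns:]
```

Raising `max_turns` sends more context per request; lowering it cuts token cost at the price of the model forgetting older turns.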

A regex look-behind? I think it will summarize the context, right?

But if we summarize the user’s previous chat history, that also adds cost for the summarization of the chat history.

Suppose we give all the history to the OpenAI API or LangChain; that increases the cost of the OpenAI API calls.

Is there any way to keep a raw history, convert it into a summary without adding extra cost, and append that summary to the prompt?

Yes. With betaassi, this is what you can do.

class HistoricalMessage(BaseMessage):
    actual_role: str = Field(default="")

class HistoricalThread(BaseThread):
    '''Has HistoricalMessage'''

The key idea is to use OpenAI’s thread messages INSTEAD of external storage.

Since OpenAI won’t let you insert a role other than “user” on a message, you use another field called actual_role on the HistoricalMessage to copy it from the original thread.

So you have the complete history of the original thread.

Summarization is easy. Don’t use an Assistant; use the Chat Completions API with the messages from the historical thread to summarize. Remember to translate actual_role back to role (since chat completions allow different roles in messages, as opposed to Messages in the beta API) and store the result in a SummarizedThread.

class SummarizedThread(BaseThread):
    summary: Optional[str] = Field(default="")

If your summary is longer than 512 characters, you can define a composite field spanning multiple actual fields.

Just save the conversations within OpenAI. See my response above to @sarthak.srivastava