GPT-3.5-turbo: Passing previous conversation in prompt (code sample)?

I’ve got a working chatbot but it doesn’t “remember” the previous message. For example, here is the output:

User: My name is Tim.

PocketAI: Hello Tim, it’s nice to meet you. How can I assist you today?

User: What is my name?

PocketAI: I’m sorry, but as an AI language model, I don’t have access to personal information such as your name.

Obviously, I would like it to answer that it knows my name is Tim.

I’ve tried many changes with this section of code, but I can’t get it to work. Any help would be greatly appreciated!

fetch('https://api.openai.com/v1/chat/completions', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'Authorization': `Bearer ${apiKey}`
  },
  body: JSON.stringify({
    model: 'gpt-3.5-turbo',
    messages: [
      { role: 'system', content: promptText },
      { role: 'user', content: inputField.value }
    ],
    temperature: 0.3,
    max_tokens: 2000
  })
})
.then(response => response.json())
.then(data => {
  messages.push({
    role: 'assistant',
    content: data.choices[0].message.content
  });
});

That’s correct: gpt-3.5-turbo doesn’t remember anything between requests. To make it remember previous messages, you need to pass them along with each new request.

The messages array then looks similar to this:

{ role: 'system', content: promptText },
{ role: 'user', content: somepreviousquestion.value },
{ role: 'assistant', content: somepreviousresponse.value },
{ role: 'user', content: somepreviousquestion2.value },
{ role: 'assistant', content: somepreviousresponse2.value },
{ role: 'user', content: somepreviousquestion3.value },
{ role: 'assistant', content: somepreviousresponse3.value },
{ role: 'user', content: somepreviousquestion4.value },
{ role: 'assistant', content: somepreviousresponse4.value },
{ role: 'user', content: inputField.value }
],
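In code, one way to keep that history is a single running array that every request is built from. This is a minimal sketch; the names (`buildPayload`, `recordAssistantReply`) are illustrative, not from the original post:

```javascript
// Keep the whole conversation in one array and send it with every request.
const messages = [{ role: 'system', content: 'You are a helpful assistant.' }];

function buildPayload(userInput) {
  // Add the new user turn to the running history
  messages.push({ role: 'user', content: userInput });
  return {
    model: 'gpt-3.5-turbo',
    messages: messages, // the full history, not just the latest message
    temperature: 0.3,
    max_tokens: 2000
  };
}

function recordAssistantReply(content) {
  // Store the model's answer so the next request includes it
  messages.push({ role: 'assistant', content });
}
```

After each response comes back, call `recordAssistantReply` with it; the next `buildPayload` then carries the whole exchange, which is what lets the model "remember" your name.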

Thank you for your reply, much appreciated.

Could you please elaborate on how to implement this? I didn’t get it.

I ran into this puzzle as well not too long ago.

The way I solved it was by initializing the first system and user messages, then storing the user input and API response as an array in a localStorage item. Each time a new question/response pair was made, it was appended to the previous data, and the updated array was passed along with the next request.

Here is my code example:

// Messages payload
// Check if the messages item exists in localStorage
if (!localStorage.getItem("messages")) {
  // If it does not exist, create an array with the initial messages
  const iMessages = [
    { role: 'system', content: "You are Eva. You have access to previous chats and responses. You also have access to updated real-time news and information. You will keep conversation to a minimum and answer to the best of your abilities." },
    { role: 'user', content: selPers.value + " " + dateContents },
  ];

  // Store the initial messages in localStorage
  localStorage.setItem("messages", JSON.stringify(iMessages));
}

// Create a new array to store the messages
let newMessages = [];

// Strip HTML wrappers from the question before sending it
const cleanedQuestion = sQuestion.replace(/<div[^>]*>|<\/div>|&nbsp;|<span[^>]*>|<\/span>/gi, '');

// Push the messages to the new array
newMessages.push({ role: 'assistant', content: lastResponse.replace(/\n/g, ' ') });
newMessages.push({ role: 'user', content: cleanedQuestion.replace(/\n/g, '') });

...

// Append the new messages to the existing messages in localStorage
let existingMessages = JSON.parse(localStorage.getItem("messages")) || [];
existingMessages = existingMessages.concat(newMessages);
localStorage.setItem("messages", JSON.stringify(existingMessages));

// Retrieve messages from local storage
var cStoredMessages = localStorage.getItem("messages");
kMessages = cStoredMessages ? JSON.parse(cStoredMessages) : [];

// API Payload
var data = {
  model: sModel,
  messages: kMessages,
  max_tokens: iMaxTokens,
  temperature: dTemperature,
  frequency_penalty: eFrequency_penalty,
  presence_penalty: cPresence_penalty,
  stop: hStop
};

// Sending API Payload
oHttp.send(JSON.stringify(data));

It works pretty well for me 🙂 I hope this might help!
For the full context, feel free to take a look at the complete code on GitHub:

The ChatGPT API does not “remember” your responses like when using it in the browser. To have a conversation with it, you need to constantly update and provide the previous conversation to it for context.

You achieve this by saving the previous response and adding it to the messages array every time you make a new API request / chat message.
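That save-and-resend loop can be sketched like this. The `send` function here is an assumed name standing in for the actual API call (e.g. a `fetch` to the chat completions endpoint); it is injected so the conversation logic is easy to follow and test:

```javascript
// Running history: the system prompt plus every turn so far.
const history = [{ role: 'system', content: 'You are a helpful assistant.' }];

async function ask(userInput, send) {
  history.push({ role: 'user', content: userInput }); // the new question
  const data = await send(history);                   // full history goes out
  const reply = data.choices[0].message.content;
  history.push({ role: 'assistant', content: reply }); // saved for next turn
  return reply;
}
```

Every call to `ask` grows `history` by two entries (one user turn, one assistant turn), so each subsequent request carries the whole conversation.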

To learn more about the basics of ChatGPT, check out the FAQ in this guide

This is great and everything, but wouldn’t this method completely shred through tokens?

Yes. Welcome to the (not sarcastic) fun game of token management
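One common starting point is a sliding window: drop the oldest turns once the history gets too long, while always keeping the system prompt. A rough sketch (the ~4-characters-per-token ratio is a rule of thumb, not an exact count):

```javascript
// Trim the running history to a rough character budget, keeping the
// system prompt and at least the most recent question/answer pair.
const MAX_CHARS = 8000; // ~2000 tokens at roughly 4 chars per token

function trimHistory(messages) {
  const [system, ...rest] = messages;
  let total = rest.reduce((n, m) => n + m.content.length, 0);
  while (rest.length > 2 && total > MAX_CHARS) {
    const dropped = rest.shift(); // remove the oldest user/assistant turn
    total -= dropped.content.length;
  }
  return [system, ...rest];
}
```

This forgets old context abruptly; fancier schemes summarize the dropped turns or retrieve relevant ones from a vector store instead.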

Is there any good way to implement chat memory without it using all of my tokens in one request? 😅

Yes, there are multiple options, but LangChain and a vector database like Pinecone should be your way to go.

Have a look at LangChain and long-term memory setups 😉

Hi, does this mean the cost of the API will increase as I keep chatting, like O(n^2)? Will the previous prompts count as billed tokens?