Continuous chat conversation that keeps all past context via the API

The OpenAI chat can follow the same conversation across multiple prompts while keeping the same context.

How can I do the same through the API, so that each new prompt can refer to what happened before and all of it is remembered as context, just as in the OpenAI chat?


Hi @myneur
That is a great question!

This board has a few good threads on how people have tried to get a memory in place, but I don’t believe anyone has replicated the exact functionality of the website yet. I believe part of the challenge is that as a conversation goes on, the number of tokens you send grows with it, eventually hitting the model’s limit.
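One common way to keep that token growth in check is to cap how many past turns you resend. A rough sketch (the `MAX_TURNS` cap of 4 and the `trimHistory` helper are just illustrative names, not from any library):

```javascript
// Keep only the most recent turns so the prompt stays under the
// model's token limit. A "turn" here is one User/AI exchange.
const MAX_TURNS = 4; // arbitrary cap; tune it for your model's context size

function trimHistory(turns) {
  // Drop the oldest turns once we exceed the cap.
  return turns.slice(-MAX_TURNS);
}

const history = [
  "User: Hello\nAI: Hi there!",
  "User: Tell me a joke\nAI: Why did the...",
  "User: Another one\nAI: Sure...",
  "User: What's 2+2?\nAI: 4",
  "User: Thanks\nAI: You're welcome!"
];

// kept holds the 4 most recent turns; the oldest one is dropped.
const kept = trimHistory(history);
```

A fancier version would count actual tokens rather than turns, but trimming by turn count is an easy first pass.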

One way I am sort of getting around this in some of the code I’ve been dabbling with is by taking the last responses and making them part of the prompt. Generally I can get about 4 levels deep before the AI seems to lose context.

Here is a JavaScript example:

```javascript
var s = oJson.choices[0].text;

// Empty response handling
if (s == "") {
    txtOutput.value += "AI: I'm sorry, can you please ask me in another way?";
} else {
    txtOutput.value += "AI: " + s.trim();
    masterOutput += "\n" + txtOutput.value + "\n";
    localStorage.setItem("masterOutput", masterOutput);
    lastResponse = s;
}

// API payload: the last response is prepended to the new question
// so the model sees recent context
var data = {
    model: sModel,
    prompt: selPers.value + lastResponse.replace(/\n/g, '') + " " + sQuestion.replace(/\n/g, ''),
    max_tokens: iMaxTokens,
    temperature: dTemperature,
    frequency_penalty: 0.0,
    presence_penalty: 0.0,
    stop: stop
};
```

Nowhere near perfect yet, and I’m sure others have far better implementations, but I am super happy with the results so far :slight_smile:

If you are curious about my full code project:


In the “messages” field when calling the API you can pass the whole conversation instead of just the last message; that way the model reads the full context. Let me know if you need a code snippet as an example.
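For example, with the Chat Completions endpoint you keep an array of messages and resend the entire history on every call. A minimal sketch (the `buildPayload` and `recordReply` helper names and the model choice are just illustrative; the actual fetch to the API with your key is omitted):

```javascript
// Conversation history: every turn is appended, and the full array
// is sent with each request, so the model sees the whole context.
const messages = [
  { role: "system", content: "You are a helpful assistant." }
];

// Build the request body for the Chat Completions endpoint,
// including the entire history so far.
function buildPayload(userInput) {
  messages.push({ role: "user", content: userInput });
  return {
    model: "gpt-3.5-turbo", // example model
    messages: messages
  };
}

// After a response arrives, store the assistant's reply too,
// so the next prompt can refer back to it.
function recordReply(replyText) {
  messages.push({ role: "assistant", content: replyText });
}

// Usage sketch:
buildPayload("What is the capital of France?");
recordReply("The capital of France is Paris.");
const followUp = buildPayload("What is its population?");
// followUp.messages now contains the earlier question and answer,
// so "its" resolves to Paris.
```

Because the whole array is resent each time, you may eventually need to trim or summarize older turns to stay within the model’s token limit.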