GPT-3.5-turbo: Passing previous conversation in prompt (code sample)?

I’ve got a working chatbot but it doesn’t “remember” the previous message. For example, here is the output:

User: My name is Tim.

PocketAI: Hello Tim, it’s nice to meet you. How can I assist you today?

User: What is my name?

PocketAI: I’m sorry, but as an AI language model, I don’t have access to personal information such as your name.

Obviously, I would like it to answer that my name is Tim.

I’ve tried many changes with this section of code, but I can’t get it to work. Any help would be greatly appreciated!

fetch('https://api.openai.com/v1/chat/completions', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'Authorization': `Bearer ${apiKey}`
  },
  body: JSON.stringify({
    model: 'gpt-3.5-turbo',
    messages: [
      { role: 'system', content: promptText },
      { role: 'user', content: inputField.value }
    ],
    temperature: 0.3,
    max_tokens: 2000
  })
})
.then(response => response.json())
.then(data => {
  // Store the assistant's reply for later use
  messages.push({
    role: 'assistant',
    content: data.choices[0].message.content
  });
});

That’s correct: gpt-3.5-turbo doesn’t remember anything between requests. To make it “remember” previous messages, you need to pass the earlier turns along with each new request.

The messages array then looks similar to this:

messages: [
  { role: 'system', content: promptText },
  { role: 'user', content: somepreviousquestion.value },
  { role: 'assistant', content: somepreviousresponse.value },
  { role: 'user', content: somepreviousquestion2.value },
  { role: 'assistant', content: somepreviousresponse2.value },
  { role: 'user', content: somepreviousquestion3.value },
  { role: 'assistant', content: somepreviousresponse3.value },
  { role: 'user', content: somepreviousquestion4.value },
  { role: 'assistant', content: somepreviousresponse4.value },
  { role: 'user', content: inputField.value }
],