Is there any way to save the previous conversation as history, so that the history isn't counted as tokens every time I send a request?

Currently this is how I'm sending the prompt, but it looks like the preset messages are counted as tokens on every request.

const response = await openai.createChatCompletion({
  model: "gpt-3.5-turbo",
  messages: [
    { role: "system", content: "You are a helpful assistant." },
    { role: "user", content: "Who won the world series in 2020?" },
    {
      role: "assistant",
      content: "The Los Angeles Dodgers won the World Series in 2020.",
    },
    { role: "user", content: "Where was it played?" },
    { role: "user", content: "can you repeat the first question?" },
    { role: "user", content: req.body.message },
  ],
});


LLMs are stateless, memoryless models: a model can only ever know what you include in the context.

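One practical consequence: you still have to keep the conversation somewhere yourself and pass it in full on each request. A minimal sketch of server-side history tracking (the `histories` map and `addMessage` helper are hypothetical names, not part of the OpenAI SDK) — note this does NOT reduce token usage, since the whole `messages` array is still sent to the API every time:

```javascript
// Hypothetical per-conversation history store so the client only sends
// its newest message; the server rebuilds the full messages array.
const histories = new Map(); // conversationId -> messages array

function addMessage(conversationId, role, content) {
  if (!histories.has(conversationId)) {
    // seed every new conversation with the system message
    histories.set(conversationId, [
      { role: "system", content: "You are a helpful assistant." },
    ]);
  }
  const messages = histories.get(conversationId);
  messages.push({ role, content });
  return messages; // pass this whole array as `messages` to createChatCompletion
}
```

After each completion you would also call `addMessage(id, "assistant", reply)` so the model's own answers stay in context for the next turn.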

Yep, that’s how it works. Every time you send a new message, you have to resend the entire conversation so far.

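Because the whole history is resent, cost grows with every turn. A common mitigation is to trim the oldest turns once the history exceeds a token budget. A rough sketch (the chars/4 estimate is a crude heuristic, not a real tokenizer — something like tiktoken would be more accurate):

```javascript
// Crude token estimate: roughly 4 characters per token for English text.
function estimateTokens(messages) {
  return messages.reduce((sum, m) => sum + Math.ceil(m.content.length / 4), 0);
}

// Drop the oldest non-system messages until the estimate fits the budget.
function trimHistory(messages, maxTokens) {
  const trimmed = [...messages];
  // keep index 0 (the system message); remove the oldest turn after it
  while (trimmed.length > 2 && estimateTokens(trimmed) > maxTokens) {
    trimmed.splice(1, 1);
  }
  return trimmed;
}
```

The trade-off is that the model forgets whatever was trimmed, so this works best when old turns stop being relevant.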

Or at least the portion of it that is relevant to the new message.
Some people try to use embeddings for that; your mileage may vary.

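The embeddings idea above can be sketched like this: embed each past message (here with pretend 3-d vectors; in practice you would call an embeddings API), then include only the messages most similar to the new query. `topRelevant` and the toy vectors are illustrative assumptions, not a library API:

```javascript
// Standard cosine similarity between two equal-length vectors.
function cosineSimilarity(a, b) {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Return the k past messages most similar to the query embedding,
// stripped back down to { role, content } for the chat API.
function topRelevant(history, queryEmbedding, k) {
  return [...history]
    .sort((x, y) =>
      cosineSimilarity(y.embedding, queryEmbedding) -
      cosineSimilarity(x.embedding, queryEmbedding))
    .slice(0, k)
    .map((m) => ({ role: m.role, content: m.content }));
}
```

This keeps the prompt small, but as noted, your mileage may vary: retrieval can miss context that mattered (e.g. "repeat the first question" needs the actual first message, not just a similar one).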