Does the chat completion API remember context?

Hello, I would like to clarify something, and please forgive my lack of knowledge.

I am trying to create a chat to arrange and parse some information for my database. But I noticed something: it gives a normal response to the first message, but when I send a second message related to the first one, it does not remember the first one.

Do I have to put every message in the prompt?
If I do that, the token usage increases unnecessarily. Is there something I am missing, or is this the working principle of the API?

The AI is stateless: it has no memory of what happened in the past, so you must pass all of the prior context for the model to include it in its next call. Conversational AI interactions are an illusion created by sending everything that has gone before to the model each time. You can truncate this data to keep it manageable, but any reference to data that has been dropped will cause hallucinations and inaccuracies.
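As a minimal sketch of that pattern, assuming the `openai` Python client and an example model name, you keep a running `messages` list yourself and resend the whole list on every call:

```python
# Minimal sketch: the client does not remember anything, so we keep the
# history ourselves and resend it with every request.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

messages = [
    {"role": "system", "content": "You help arrange and parse data for a database."},
]

def ask(user_text: str) -> str:
    # Append the new user turn, then send the *entire* history.
    messages.append({"role": "user", "content": user_text})
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name; substitute your own
        messages=messages,
    )
    reply = response.choices[0].message.content
    # Store the assistant turn so the next call still has this context.
    messages.append({"role": "assistant", "content": reply})
    return reply

print(ask("Parse this record: name=Ada, role=engineer"))
print(ask("Now add her start date"))  # "her" only resolves because the first turn was resent
```

To control token usage you can trim or summarize the oldest turns before each call, but anything removed is simply gone from the model's point of view.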
