I don’t recall where I learnt it, but I am pretty sure that, unlike ChatGPT, the models available via the API have no memory between requests.
However, seemingly contradicting this, the playground for the API has a chat example, and it’s not obvious how that works. I have seen other people ask how, but I have not seen an answer.
The answer must be that the context has to be part of the prompt: each request resends the conversation so far.
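A minimal sketch of what that looks like on the client side. Here `ask` is a hypothetical stand-in for a real chat-completion call (e.g. the OpenAI chat API, which takes a list of `{"role": ..., "content": ...}` messages); the point is only that the full history is resent on every turn.

```python
# The API is stateless, so the client must carry the conversation itself
# and send the whole thing with every request.

def ask(history):
    # Hypothetical placeholder for an actual API call such as
    # client.chat.completions.create(model=..., messages=history).
    # Here it just echoes the last user message so the sketch runs offline.
    return f"echo: {history[-1]['content']}"

history = [{"role": "system", "content": "You are a helpful assistant."}]

def chat(user_text):
    history.append({"role": "user", "content": user_text})
    reply = ask(history)  # the full history goes out on every turn
    history.append({"role": "assistant", "content": reply})
    return reply

chat("Hello")
chat("What did I just say?")
# By the second call, the request already contains both earlier turns --
# that resent context is the model's only "memory".
```

One consequence of this design: the prompt grows with every turn, so long conversations eventually hit the model's context limit unless old turns are dropped or summarised.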