Just started 2 days ago with ChatGPT and the OpenAI API, so please forgive me for asking stupid questions.
My understanding is that ChatGPT can remember previous prompts (and answers?), so you get a natural conversation, right?
How can I achieve this with the API, using the chat/completions endpoint?
Is it designed for that, or does it only support one-shot questions?
Is the “user” field of the request JSON structure somehow designed for that?
If there is a topic in the API docs that I have missed, a link is welcome.
Seb
Yeah, you basically just re-send the previous user/assistant message pairs with each request to give the model its “context/memory”… The costs do add up, though, which should make you appreciate ChatGPT more! Haha.
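Here’s a rough sketch of what that looks like with the (v1+) openai Python package; the model name and the messages are just placeholders:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# The endpoint is stateless, so you re-send the whole history on every call.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What is the capital of France?"},
    {"role": "assistant", "content": "The capital of France is Paris."},
    {"role": "user", "content": "And how many people live there?"},  # follow-up only makes sense with the history above
]

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # example model name
    messages=messages,
)

reply = response.choices[0].message.content
messages.append({"role": "assistant", "content": reply})  # keep the history for the next turn
print(reply)
```

Every token in that messages list is billed again on every call, which is where the cost comes from.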
The “user” field is mostly (as far as I know) just for tying a request to one of your own end users’ IDs, so that if someone is sending “bad” prompts it’s easier for you (and OpenAI) to deal with them… It’s not used for the conversation itself, I’m pretty sure…
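If you do want to send it, it’s just an extra parameter on the same call (continuing from the snippet above; the ID string is whatever you use internally):

```python
response = client.chat.completions.create(
    model="gpt-3.5-turbo",   # example model name
    messages=messages,
    user="end-user-1234",    # opaque ID for your end user; used for abuse monitoring, not for memory
)
```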
Hope this helps. 
Hi Paul, thanks for your quick answer!
I just re-read the intro about chat:
https://platform.openai.com/docs/guides/text-generation/chat-completions-api
Now I understand the purpose of the “assistant” messages, which are there to give the context/history of a conversation…
Assistant messages store previous assistant responses, but can also be written by you to give examples of desired behavior.
Right?
Yup! Sounds like you’re on the right path.
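For example, a hand-written user/assistant pair can act as a one-shot demo of the behavior you want (everything here is made up for illustration):

```python
from openai import OpenAI

client = OpenAI()

# The first user/assistant pair is written by hand as an example of the desired style;
# only the last user message is the real question.
response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # example model name
    messages=[
        {"role": "system", "content": "Answer in exactly one short sentence."},
        {"role": "user", "content": "What does HTTP stand for?"},
        {"role": "assistant", "content": "HyperText Transfer Protocol."},
        {"role": "user", "content": "What does JSON stand for?"},
    ],
)
print(response.choices[0].message.content)
```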
Two other links that can help…
https://platform.openai.com/docs/quickstart
and
Good luck coding, and let us know if you need anything else.
Hope that helps!