Sorry if the terminology is a bit off; I don’t recall seeing a glossary for these parts.
For a ChatGPT conversation, it has been noted that after the first prompt ChatGPT creates a summary of the earlier part of the conversation and passes it along with the next prompt. My thought is that what actually gets submitted is made up of something like the following three parts (a rough code sketch follows the list):
1. The OpenAI prompt, which does many things such as protecting ChatGPT from generating illicit information.
2. The summary prompt (or context prompt), which summarizes the previous part of the conversation. This is not created by the user; it is created by part of the processing chain after the user prompt is submitted.
3. The user prompt, which is given by the user as the next part of the conversation.
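To make it concrete, here is a rough sketch of how I imagine those three parts being combined on each turn. The role names just follow the Chat Completions message format, and both the OpenAI prompt and the summary text are placeholders I made up, not anything OpenAI has documented:

```python
# Rough sketch of how I imagine the parts being combined for each turn.
# The role names follow the Chat Completions message format; the actual
# OpenAI prompt and summary text are hidden, so these are placeholders.

openai_prompt = (
    "You are ChatGPT... (OpenAI's own instructions, e.g. safety rules)"
)

summary_prompt = (
    "Summary of the conversation so far: the user is writing a Python "
    "parser and has already fixed two syntax errors..."  # made-up content
)

user_prompt = "Here is the next error the parser reported: ..."

messages = [
    {"role": "system", "content": openai_prompt},
    {"role": "system", "content": summary_prompt},  # assumed placement
    {"role": "user", "content": user_prompt},
]
```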
Is there a way to get what I refer to as the summary prompt?
If others can cite references to the actual terminology for the parts, it would be appreciated.
Might sound a bit redundant, but have you tried asking it directly? “Can you summarise the previous part of the conversation fit for your use and return it to me?”
I don’t think it would return the summary exactly as ChatGPT uses it internally, simply because that would be akin to prompt injection, but this might get you the closest possible approximation.
One of the reasons I asked this question is this: if GPT-4 is generating source code, the code is being parsed, and the errors are given back to GPT-4, repeating until there are no errors, then sometimes the errors may stem from the summary prompt. This could be especially true if the conversation runs long. Thus, seeing the exact summary prompt would be needed, as asking for a summary within the conversation would not be accurate enough.
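For context, this is roughly the kind of loop I mean. It is only a sketch using the current openai Python package: the model name, prompts, and the ast-based syntax check are my own placeholders, and it assumes the reply comes back as plain code without markdown fences:

```python
# Sketch of the generate / parse / feed-errors-back loop described above.
# Model name, prompts, and the ast-based check are placeholders; it also
# assumes the reply is plain Python code without markdown fences.
import ast

from openai import OpenAI

client = OpenAI()


def first_syntax_error(source: str) -> str | None:
    """Return a description of the first syntax error, or None if it parses."""
    try:
        ast.parse(source)
        return None
    except SyntaxError as err:
        return f"line {err.lineno}: {err.msg}"


messages = [{"role": "user", "content": "Write a Python function that ..."}]

for _ in range(5):  # cap the number of retries
    reply = client.chat.completions.create(model="gpt-4", messages=messages)
    code = reply.choices[0].message.content
    error = first_syntax_error(code)
    if error is None:
        break  # no errors left
    # Feed the error back as the next user prompt; on a long conversation
    # this is where a lossy summary prompt could start causing trouble.
    messages.append({"role": "assistant", "content": code})
    messages.append({"role": "user", "content": f"Fix this error: {error}"})
```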