I have a lengthy system prompt that I send to ChatGPT in every conversation. Is it possible to make ChatGPT remember this system prompt once, so that it doesn’t need to be resent in every subsequent conversation?
If such a method exists, would it help save tokens or reduce API usage costs in any way? Thank you.
Hi,
That is not currently an option; the models are stateless and require all context, including system messages, to be sent with each request.
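To make the statelessness concrete, here is a minimal sketch (the function names and the rough token estimate are illustrative, not a real SDK call) showing that every request payload has to carry the full system prompt again:

```python
# Sketch: the API is stateless, so the full system prompt is part
# of every request payload, and you pay for it every time.

SYSTEM_PROMPT = "You are a helpful assistant with a long set of rules..."

def build_request(history, user_message):
    """Assemble the complete message list sent with one API call."""
    return [
        {"role": "system", "content": SYSTEM_PROMPT},  # resent every time
        *history,
        {"role": "user", "content": user_message},
    ]

def approx_tokens(messages):
    """Very rough token estimate (~4 characters per token); illustrative only."""
    return sum(len(m["content"]) for m in messages) // 4

# Two separate conversations: each one includes (and bills for) the
# system prompt again, even though its text never changed.
req1 = build_request([], "Hello")
req2 = build_request([], "A completely new conversation")
```

Note that `approx_tokens(req1)` and `approx_tokens(req2)` both include the system prompt's share: there is no per-conversation place to store it on the server side.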
There are some misconceptions here. First, such a feature does not (yet) exist, but more importantly it would NOT SAVE YOU TOKENS, because YOU PAY FOR PROCESSING.
You are basically asking: “I drove from A to B via C - can the car not remember the route from C to B, so it uses no gas on that part of the trip?” You pay for the processing regardless of whether the provider stores anything - the computation still happens each time. Please learn how LLMs work.
Would it not be possible to have a system that literally just “copies” a conversation into another one? Say you have conversation 1, where you upload an image. Then you create conversation 2, which contains all the data copied from 1 (with the image already processed), and you run prompts on that. Would that not be significantly faster than re-tokenizing everything?