Is there a way to save tokens when dealing with longer prompts in each conversation?

I have a lengthy system prompt that I send to ChatGPT in every conversation. Is it possible to have ChatGPT remember this system prompt once, so that it doesn't need to be sent with every subsequent request?
If such a method exists, would it help save tokens or reduce API usage costs? Thank you.


That is not currently an option: the models are stateless, so all context, including system messages, must be sent with every request.
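Because the API is stateless, client code typically stores the system prompt once locally and prepends it to every request's message list. A minimal sketch of that pattern (the helper name and prompt text are illustrative, and no actual API call is made):

```python
SYSTEM_PROMPT = "You are a helpful assistant."  # your lengthy system prompt, stored once client-side

def build_messages(history, user_input):
    """Prepend the stored system prompt to a request payload.

    The model keeps no state between calls, so this full list,
    system prompt included, must be sent on every request.
    """
    return (
        [{"role": "system", "content": SYSTEM_PROMPT}]
        + history
        + [{"role": "user", "content": user_input}]
    )

# Each turn rebuilds the complete payload from scratch:
history = []
payload = build_messages(history, "Summarize our policy.")
# payload[0] is always the system prompt, resent with every call
```

Storing the prompt in a constant only saves you from retyping it; the tokens are still transmitted and billed on each request.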


There is some confusion here. First, such a feature does not (yet) exist, but more importantly it would not save you tokens, because you pay for processing.
You are essentially asking, "I drove from A to B via C; can't the car remember the route from C to B, so it doesn't use gas on that stretch next time?" The processing happens on every request whether or not anything is remembered, and that processing is what you pay for. It helps to learn how LLMs work.
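To make the cost argument concrete: input tokens are billed per request, so a system prompt is paid for every time it is sent. A back-of-envelope sketch (the per-token rate below is a hypothetical placeholder, not a quoted price):

```python
def prompt_cost(system_tokens, n_requests, price_per_1k_tokens):
    """Total input cost attributable to resending the system prompt.

    price_per_1k_tokens is a placeholder rate for illustration;
    check current API pricing for real numbers.
    """
    return system_tokens / 1000 * price_per_1k_tokens * n_requests

# A 1,000-token system prompt across 100 requests at a
# hypothetical $0.01 per 1K input tokens:
total = prompt_cost(1000, 100, 0.01)
# 1000 tokens are billed on each of the 100 requests,
# i.e. 100,000 input tokens attributable to the prompt alone
```

Whatever the actual rate, the multiplier is the number of requests: the prompt's tokens are processed, and billed, on every call.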
