Caching prompts – using the same prompt for multiple contexts

I need to send the same prompt each time, but with different context.
Is there any way to cache the prompt and send the contexts one after another without appending the prompt every time?


Bump! Would be a great addition to my case as well.

To clarify – are you referring to using the same system prompt with different data (user prompt) in each API call?

If so, the new GPT/assistants can be accessed via the API and give you this ability.
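To illustrate that pattern, here's a minimal sketch using the OpenAI Python SDK: the system prompt is defined once and paired with each piece of context per call. This assumes the official `openai` package and an `OPENAI_API_KEY` environment variable; the prompt text and `contexts` list are placeholders, and the actual API call is shown commented out.

```python
# Sketch: reuse one fixed system prompt across calls, varying only the
# per-call user context. Prompt text and contexts below are illustrative.

SYSTEM_PROMPT = "You are a helpful assistant that summarizes documents."

def build_messages(context: str) -> list[dict]:
    """Pair the fixed system prompt with one piece of per-call context."""
    return [
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": context},
    ]

contexts = ["First document ...", "Second document ..."]
batches = [build_messages(c) for c in contexts]

# Each batch is a separate API call. Note the system prompt's tokens are
# still billed on every call -- this reuses the prompt in code, it does
# not cache it server-side.
# from openai import OpenAI
# client = OpenAI()
# for messages in batches:
#     response = client.chat.completions.create(
#         model="gpt-4o-mini", messages=messages
#     )
#     print(response.choices[0].message.content)
```

The key point: keeping the system prompt in one place avoids duplicating it in your code, but each request still carries (and is charged for) those tokens.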

A workaround is using “threads”, although that doesn’t really help (IMHO) because you’re still charged for the tokens; it’s only valuable if you need repeated calls to build up context.

Have you tested calling custom GPTs, or do you need help doing that?