AMA on the 17th of December with OpenAI's API Team: Post Your Questions Here

Any plans to improve prompt caching? Specifically: let us specify which content to cache, give it a variable name, and reference that cached content in subsequent API calls for a set time duration. I know another company is doing this 🙂
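To illustrate the behavior I'm asking for, here's a toy sketch of a named cache entry with a TTL that later calls could reference by name. Everything here is invented for illustration — the class, method names, and parameters are not part of any real API:

```python
import time


class NamedPromptCache:
    """Toy illustration of the requested feature: store a prompt
    fragment under a name, then reference it for a limited time.
    (Hypothetical -- not an actual OpenAI API surface.)"""

    def __init__(self):
        self._entries = {}  # name -> (content, expires_at)

    def put(self, name, content, ttl_seconds=300):
        # Cache `content` under `name` for `ttl_seconds`.
        self._entries[name] = (content, time.monotonic() + ttl_seconds)

    def get(self, name):
        # Return the cached content, or None if missing or expired.
        entry = self._entries.get(name)
        if entry is None:
            return None
        content, expires_at = entry
        if time.monotonic() > expires_at:
            del self._entries[name]
            return None
        return content


# Usage: cache a system prompt once, then reference it by name
# in later "calls" instead of resending the full text.
cache = NamedPromptCache()
cache.put("system_prompt", "You are a helpful assistant.", ttl_seconds=600)
print(cache.get("system_prompt"))
```

The point is the calling pattern — write once under a name, read by name until expiry — rather than the implicit prefix-matching cache that exists today.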