Is prompt caching supported in the API and Assistants?

I like prompt caching
https://platform.openai.com/docs/guides/prompt-caching

But it doesn’t mention whether this also applies to the Playground and the Assistants API.

Title says it all


I’m not sure. I’ve tried prompt caching on the API using gpt-4o, but it didn’t work. So I guess we’ll find out more as it gets released (or as we learn how to use it, if this is an “I’m not using it correctly” case).

It looks as though it is specifically turned on for Chat Completions.

https://platform.openai.com/docs/guides/prompt-caching/requirements
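One way to check whether caching actually kicked in is to inspect the usage breakdown in the Chat Completions response. A minimal sketch, assuming the `prompt_tokens_details.cached_tokens` field described in the caching docs (the helper function and sample values here are illustrative, not an official API):

```python
def cached_tokens(usage: dict) -> int:
    """Return how many prompt tokens were served from the cache.

    Assumes the response `usage` object carries a `prompt_tokens_details`
    sub-object with a `cached_tokens` count, per the prompt-caching guide.
    """
    details = usage.get("prompt_tokens_details") or {}
    return details.get("cached_tokens", 0)


# Illustrative usage object, shaped like a Chat Completions response;
# the numbers are made up for the example.
usage = {
    "prompt_tokens": 2048,
    "completion_tokens": 50,
    "prompt_tokens_details": {"cached_tokens": 1024},
}

print(cached_tokens(usage))  # 1024 of the 2048 prompt tokens came from cache
```

If `cached_tokens` stays at 0 on repeated identical prompts over ~1024 tokens, caching likely isn’t active for that request.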

Though I surmise they will soon release an update for Assistants that includes all sorts of neat stuff. If it were already available there, the docs would say so explicitly.

Otherwise, caching should be supported by the Chat Completions models (not Assistants) in the Playground, since the Playground hits the same endpoint you would access programmatically.