Is prompt caching compatible with end-user IDs?

Hi everyone. I'm trying to figure out what is preventing my application from fully utilizing cached inputs. I have a long prompt that stays the same on every request. Since this is my first live application, I'm sending end-user IDs with every call, as recommended in https://platform.openai.com/docs/guides/safety-best-practices#end-user-ids. Could this be the underlying issue? Or maybe the metadata I'm adding to the request?
Thanks for your time and replies.

User IDs actually help you hit cached prompts, according to the prompt caching guide:

  • If you provide the user parameter, it is combined with the prefix hash, allowing you to influence routing and improve cache hit rates. This is especially beneficial when many requests share long, common prefixes.

To get cache hits, make sure the requirements in that guide are met: the prompt must be at least 1024 tokens long, and the cached portion of the prefix has to be exactly identical across requests.
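One way to check whether your requests are actually being served from the cache is to look at the `usage` block returned with each response. Below is a minimal sketch, assuming the usage payload follows the Chat Completions shape where cached prompt tokens are reported under `prompt_tokens_details.cached_tokens` (the numbers here are illustrative, not from a real response):

```python
def cached_fraction(usage: dict) -> float:
    """Return the share of prompt tokens that were served from the cache (0.0-1.0)."""
    prompt_tokens = usage.get("prompt_tokens", 0)
    details = usage.get("prompt_tokens_details") or {}
    cached = details.get("cached_tokens", 0)
    return cached / prompt_tokens if prompt_tokens else 0.0

# Illustrative usage payload, shaped like the one returned alongside a response.
usage = {
    "prompt_tokens": 2006,
    "prompt_tokens_details": {"cached_tokens": 1920},
    "completion_tokens": 300,
}

print(f"cache hit fraction: {cached_fraction(usage):.2%}")
```

If `cached_tokens` is consistently 0, the prefix is either below the minimum length or not byte-for-byte identical between calls.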

Thanks for the quick reply. Yes, I've already been through the requirements section. Right now I'm going through the logs, and the same user ID shows up on both cached and non-cached requests, along with minor variations in the metadata. Note that I'm also sending the user ID inside the metadata, so the issue may come from the metadata alone. Unfortunately, I haven't found any documentation on this.
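For that kind of log analysis, it can help to group requests by user ID and count how many hit the cache, so you can see whether a given ID behaves consistently. A quick sketch, assuming hypothetical log rows as dicts with the user ID and the cached-token count pulled from each response's `usage` block:

```python
from collections import defaultdict

# Hypothetical log rows; in practice these would come from your own request logs.
logs = [
    {"user": "u1", "prompt_tokens": 2006, "cached_tokens": 1920},
    {"user": "u1", "prompt_tokens": 2006, "cached_tokens": 0},
    {"user": "u2", "prompt_tokens": 2006, "cached_tokens": 1920},
]

by_user = defaultdict(lambda: {"hits": 0, "total": 0})
for row in logs:
    stats = by_user[row["user"]]
    stats["total"] += 1
    if row["cached_tokens"] > 0:
        stats["hits"] += 1

for user, stats in sorted(by_user.items()):
    print(f"{user}: {stats['hits']}/{stats['total']} requests hit the cache")
```

If a single user ID shows both hits and misses with an unchanged prompt, the cause is more likely cache eviction or routing than the ID itself.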