Responses API not using cached inputs for o3-mini

I recently switched from the Chat Completions endpoint to the Responses endpoint for requests using o3-mini and gpt-4o-mini. I am using the exact same prompts as before, and the vast majority of the prompt content is effectively identical between the gpt-4o-mini and o3-mini requests. However, our cached input tokens have effectively dropped to 0 for o3-mini. I know the input should be counting as cached, because I am still seeing cached input tokens for gpt-4o-mini with the same data.
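For reference, this is roughly how we're measuring it: we read `cached_tokens` out of the `usage` object returned by the Responses API. A minimal sketch (the token counts below are illustrative placeholders, not our real billing data):

```python
# Sketch: checking prompt-cache hits from a Responses API usage payload.
# The payloads below are illustrative examples of what we observe,
# not real numbers from our account.

def cached_fraction(usage: dict) -> float:
    """Fraction of input tokens that were served from the prompt cache."""
    cached = usage["input_tokens_details"]["cached_tokens"]
    return cached / usage["input_tokens"]

# gpt-4o-mini: same prompt, cache hits show up as expected
usage_4o_mini = {
    "input_tokens": 4096,
    "input_tokens_details": {"cached_tokens": 3328},
    "output_tokens": 512,
}

# o3-mini: same prompt, but cached_tokens is reported as 0
usage_o3_mini = {
    "input_tokens": 4096,
    "input_tokens_details": {"cached_tokens": 0},
    "output_tokens": 512,
}

print(f"gpt-4o-mini cached: {cached_fraction(usage_4o_mini):.0%}")
print(f"o3-mini cached: {cached_fraction(usage_o3_mini):.0%}")
```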

When will this issue be rectified, and will we be credited for what should be counted as cached input?