OpenAI Agents SDK - Issue with prompt caching

Prompts longer than 1024 tokens are not being cached. I tried both gpt-4o and gpt-4.1 through the Agents SDK. When I print the usage in my logs, I get this:

usage=Usage(requests=1, input_tokens=1074, output_tokens=26, total_tokens=1100)
usage=Usage(requests=1, input_tokens=1053, output_tokens=25, total_tokens=1078)
usage=Usage(requests=1, input_tokens=1080, output_tokens=22, total_tokens=1102)

cached_tokens and prompt_tokens_details are missing from the usage output entirely. Does that mean the prompt is not being cached, or is the SDK just not surfacing the field? How can I fix this?
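For what it's worth, in the raw OpenAI API responses the cached-token count lives in a nested details object (prompt_tokens_details.cached_tokens on Chat Completions, input_tokens_details.cached_tokens on the Responses API), and it can legitimately be 0 on the first request since that call is what populates the cache. Here is a small sketch of how I've been probing the usage payload defensively; get_cached_tokens is my own hypothetical helper, not part of any SDK, and it assumes the usage has been dumped to a plain dict:

```python
def get_cached_tokens(usage: dict) -> int:
    """Return the cached-token count from a usage payload, or 0 if absent.

    Checks both field spellings: 'input_tokens_details' (Responses API)
    and 'prompt_tokens_details' (Chat Completions API). A missing details
    object is treated the same as cached_tokens == 0.
    """
    details = (
        usage.get("input_tokens_details")
        or usage.get("prompt_tokens_details")
        or {}
    )
    return details.get("cached_tokens", 0) or 0


# Example payloads shaped like the two API variants:
first_call = {"input_tokens": 1074, "output_tokens": 26}           # no details at all
second_call = {
    "input_tokens": 1053,
    "output_tokens": 25,
    "input_tokens_details": {"cached_tokens": 1024},               # cache hit
}

print(get_cached_tokens(first_call))   # 0 -> no evidence of a cache hit
print(get_cached_tokens(second_call))  # 1024
```

Note that caching also requires the first ~1024 tokens of the prompt to be byte-identical across requests, so a timestamp or varying system prompt near the top will defeat it even when the total length is over the threshold.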